Mar 18 15:34:09 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 15:34:10 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 15:34:10 crc restorecon[4691]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10
crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 15:34:10 crc restorecon[4691]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 
crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 15:34:10 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 15:34:10 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 18 15:34:11 crc kubenswrapper[4792]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.583530 4792 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591201 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591233 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591243 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591271 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591281 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591289 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591297 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591305 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591315 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591324 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591332 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591340 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591347 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591355 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591363 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591370 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591378 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591386 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591394 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591402 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591410 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591418 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591425 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591432 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591440 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591450 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591460 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591469 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591478 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591487 4792 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591497 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591505 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591513 4792 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591521 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591530 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591538 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591545 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591553 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591560 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591569 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591577 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591585 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591594 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591602 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591609 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591617 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591625 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591634 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591644 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591653 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591661 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591670 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591678 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591685 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591693 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591701 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591709 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591716 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591724 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591731 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591739 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591747 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 15:34:11 crc kubenswrapper[4792]: 
W0318 15:34:11.591754 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591763 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591771 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591778 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591786 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591794 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591801 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591808 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.591817 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592065 4792 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592083 4792 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592096 4792 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592106 4792 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592139 4792 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592148 4792 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592161 4792 flags.go:64] FLAG: 
--authorization-mode="AlwaysAllow" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592171 4792 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592180 4792 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592189 4792 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592199 4792 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592208 4792 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592217 4792 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592226 4792 flags.go:64] FLAG: --cgroup-root="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592234 4792 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592243 4792 flags.go:64] FLAG: --client-ca-file="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592252 4792 flags.go:64] FLAG: --cloud-config="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592261 4792 flags.go:64] FLAG: --cloud-provider="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592269 4792 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592280 4792 flags.go:64] FLAG: --cluster-domain="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592289 4792 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592298 4792 flags.go:64] FLAG: --config-dir="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592307 4792 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 
15:34:11.592316 4792 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592327 4792 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592335 4792 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592344 4792 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592354 4792 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592363 4792 flags.go:64] FLAG: --contention-profiling="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592371 4792 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592380 4792 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592389 4792 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592399 4792 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592410 4792 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592419 4792 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592428 4792 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592437 4792 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592446 4792 flags.go:64] FLAG: --enable-server="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592454 4792 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592465 4792 flags.go:64] FLAG: --event-burst="100" Mar 
18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592473 4792 flags.go:64] FLAG: --event-qps="50" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592482 4792 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592491 4792 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592500 4792 flags.go:64] FLAG: --eviction-hard="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592510 4792 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592519 4792 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592528 4792 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592537 4792 flags.go:64] FLAG: --eviction-soft="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592546 4792 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592554 4792 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592563 4792 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592572 4792 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592580 4792 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592589 4792 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592598 4792 flags.go:64] FLAG: --feature-gates="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592609 4792 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592618 4792 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" 
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592627 4792 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592636 4792 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592645 4792 flags.go:64] FLAG: --healthz-port="10248" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592654 4792 flags.go:64] FLAG: --help="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592662 4792 flags.go:64] FLAG: --hostname-override="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592671 4792 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592679 4792 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592690 4792 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592699 4792 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592708 4792 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592717 4792 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592726 4792 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592735 4792 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592743 4792 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592753 4792 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592762 4792 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592770 4792 flags.go:64] FLAG: --kube-reserved="" Mar 
18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592779 4792 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592788 4792 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592797 4792 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592833 4792 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592843 4792 flags.go:64] FLAG: --lock-file="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592851 4792 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592860 4792 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592869 4792 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592882 4792 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592891 4792 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592900 4792 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592909 4792 flags.go:64] FLAG: --logging-format="text" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592917 4792 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592927 4792 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592936 4792 flags.go:64] FLAG: --manifest-url="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592944 4792 flags.go:64] FLAG: --manifest-url-header="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592955 4792 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.592964 4792 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593007 4792 flags.go:64] FLAG: --max-pods="110" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593017 4792 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593025 4792 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593034 4792 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593043 4792 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593052 4792 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593061 4792 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593070 4792 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593088 4792 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593097 4792 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593105 4792 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593114 4792 flags.go:64] FLAG: --pod-cidr="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593123 4792 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593136 4792 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 
15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593145 4792 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593155 4792 flags.go:64] FLAG: --pods-per-core="0" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593163 4792 flags.go:64] FLAG: --port="10250" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593172 4792 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593181 4792 flags.go:64] FLAG: --provider-id="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593190 4792 flags.go:64] FLAG: --qos-reserved="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593198 4792 flags.go:64] FLAG: --read-only-port="10255" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593207 4792 flags.go:64] FLAG: --register-node="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593216 4792 flags.go:64] FLAG: --register-schedulable="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593225 4792 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593239 4792 flags.go:64] FLAG: --registry-burst="10" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593248 4792 flags.go:64] FLAG: --registry-qps="5" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593256 4792 flags.go:64] FLAG: --reserved-cpus="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593265 4792 flags.go:64] FLAG: --reserved-memory="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593276 4792 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593284 4792 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593293 4792 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593302 4792 flags.go:64] 
FLAG: --rotate-server-certificates="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593310 4792 flags.go:64] FLAG: --runonce="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593319 4792 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593328 4792 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593337 4792 flags.go:64] FLAG: --seccomp-default="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593347 4792 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593364 4792 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593374 4792 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593383 4792 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593392 4792 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593400 4792 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593409 4792 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593417 4792 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593426 4792 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593435 4792 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593444 4792 flags.go:64] FLAG: --system-cgroups="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593452 4792 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 15:34:11 crc 
kubenswrapper[4792]: I0318 15:34:11.593465 4792 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593474 4792 flags.go:64] FLAG: --tls-cert-file="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593482 4792 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593494 4792 flags.go:64] FLAG: --tls-min-version="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593502 4792 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593511 4792 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593519 4792 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593528 4792 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593537 4792 flags.go:64] FLAG: --v="2" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593547 4792 flags.go:64] FLAG: --version="false" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593558 4792 flags.go:64] FLAG: --vmodule="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593568 4792 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.593578 4792 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593774 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593784 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593793 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593801 4792 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593810 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593820 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593829 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593839 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593851 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593859 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593868 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593877 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593886 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593895 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593906 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593914 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593922 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593932 4792 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593942 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593952 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593962 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.593994 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594003 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594011 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594019 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594028 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594038 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594046 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594054 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594062 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594070 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594077 4792 feature_gate.go:330] unrecognized feature 
gate: ExternalOIDC Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594085 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594093 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594100 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594108 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594117 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594126 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594136 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594144 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594155 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594162 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594170 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594177 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594185 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594192 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:34:11 
crc kubenswrapper[4792]: W0318 15:34:11.594200 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594207 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594215 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594222 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594230 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594238 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594245 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594254 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594261 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594269 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594276 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594284 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594291 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594299 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 
15:34:11.594307 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.594315 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.596488 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.596594 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.596607 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.596618 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.597068 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.597084 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.597093 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.597102 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.597110 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.597125 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 
15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.614576 4792 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.614634 4792 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614755 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614772 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614785 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614794 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614803 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614810 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614820 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614829 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614838 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614846 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614854 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614862 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:34:11 crc 
kubenswrapper[4792]: W0318 15:34:11.614870 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614881 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614889 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614899 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614907 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614915 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614924 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614932 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614941 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614949 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614956 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.614965 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615004 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615013 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 
15:34:11.615022 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615030 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615038 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615046 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615054 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615062 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615069 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615079 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615088 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615097 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615105 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615115 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615125 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615135 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615143 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615151 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615161 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615170 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615179 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615188 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615196 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615204 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615212 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615220 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615228 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615236 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615246 4792 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615254 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615262 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615270 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615278 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615286 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615294 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615302 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615311 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615318 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615327 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615335 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615344 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615352 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615361 4792 feature_gate.go:330] unrecognized feature 
gate: HardwareSpeed Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615369 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615377 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615384 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615392 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.615405 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615645 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615657 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615667 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615677 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615686 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615694 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615703 
4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615712 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615721 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615730 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615739 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615747 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615758 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615769 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615780 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615789 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615797 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615808 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615816 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615825 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615836 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615846 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615856 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615865 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615874 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615882 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615891 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615900 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615908 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615916 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615924 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615932 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615940 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615948 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.615959 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616003 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616083 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616092 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616101 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616109 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616118 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616125 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616134 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616142 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616150 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616158 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616166 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616175 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616183 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616191 4792 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616199 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616207 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616215 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616225 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616233 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616242 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616252 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616262 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616271 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616280 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616288 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616297 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616305 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616312 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616320 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616327 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616336 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616344 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616352 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616360 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.616368 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.616380 4792 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.617354 4792 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.624056 4792 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.629788 4792 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.629964 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.631764 4792 server.go:997] "Starting client certificate rotation" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.631827 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.632076 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.660550 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.663502 4792 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.664436 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.678283 4792 log.go:25] "Validated CRI v1 runtime API" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.723733 4792 log.go:25] "Validated CRI v1 image API" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.726357 4792 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.731148 4792 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-15-29-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.731198 4792 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.759705 4792 manager.go:217] Machine: {Timestamp:2026-03-18 15:34:11.756236403 +0000 UTC m=+0.625565410 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:23b8af29-cf8d-424d-a7c3-490f2387f9d8 BootID:85dc16d2-3bef-455f-aff6-17a99cc51456 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:50:58:7d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:50:58:7d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b3:ab:ca Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c3:cf:ab Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:dc:64:cd Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:13:6d:a9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:84:a5:8e:ae:2f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:0e:95:dc:ec:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.760144 4792 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.760392 4792 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.760953 4792 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.761150 4792 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.761191 4792 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.761423 4792 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.761434 4792 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.762106 4792 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.762138 4792 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.764874 4792 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.764982 4792 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.768668 4792 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.768696 4792 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.768748 4792 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.768766 4792 kubelet.go:324] "Adding apiserver pod source"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.768783 4792 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.773399 4792 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.774504 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.774592 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.774615 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.774697 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.775589 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.777255 4792 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779726 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779777 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779793 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779807 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779858 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779873 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779886 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779908 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779923 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779937 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.779997 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.780011 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.780915 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.781701 4792 server.go:1280] "Started kubelet"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.782148 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.782959 4792 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.782954 4792 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.784274 4792 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 15:34:11 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.786215 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.786325 4792 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.786498 4792 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.786529 4792 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.786694 4792 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.787642 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.788197 4792 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.788306 4792 factory.go:55] Registering systemd factory
Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.788369 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.793524 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.793949 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.794078 4792 factory.go:221] Registration of the systemd container factory successfully
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.797996 4792 factory.go:153] Registering CRI-O factory
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.798086 4792 factory.go:221] Registration of the crio container factory successfully
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.798226 4792 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.797353 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df967a3f69709 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,LastTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.801228 4792 factory.go:103] Registering Raw factory
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.801291 4792 manager.go:1196] Started watching for new ooms in manager
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.803503 4792 manager.go:319] Starting recovery of all containers
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.808903 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809024 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809058 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809084 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809107 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809138 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809166 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809193 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809224 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809253 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809280 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809309 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809335 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809365 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809396 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809426 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809455 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809486 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809514 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809542 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809572 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809815 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809842 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809866 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809892 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.809926 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810034 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810059 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810080 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810098 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810118 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810137 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810159 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810180 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810202 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810221 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810242 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810261 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810280 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810301 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810323 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810341 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810361 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810382 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810401 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810420 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810440 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810461 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810482 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810502 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810519 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810540 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810567 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810588 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810610 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810634 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810653 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810671 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810692 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810710 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810731 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810750 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810771 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810791 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810812 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810831 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810852 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810872 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810891 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810910 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810927 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.810948 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811013 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811034 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811055 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811075 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811095 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811114 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811133 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811152 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811171 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811193 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811213 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811230 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811252 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811277 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811298 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811318 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811336 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811355 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811373 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811392 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811411 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811433 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811452 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811470 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811494 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811514 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811556 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811576 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811596 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811616 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811637 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811663 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811686 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811709 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811733 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811755 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811778 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811800 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811822 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811849 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811869 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" 
seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811889 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811909 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811928 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.811952 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812046 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812067 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 
15:34:11.812088 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812108 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812128 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812150 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812172 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812192 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812211 4792 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812231 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812252 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812272 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812292 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812312 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812333 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812353 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812373 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812393 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812414 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812434 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812454 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812474 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812496 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812517 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812538 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812559 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812582 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812603 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.812624 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814285 4792 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814331 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814354 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814375 4792 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814396 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814417 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814437 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814456 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814481 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814501 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814524 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814544 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814566 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814589 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814608 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814628 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814650 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814671 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814692 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814714 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814735 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814757 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814778 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814797 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814818 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814839 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814860 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814881 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814902 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814926 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814946 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.814995 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815017 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815039 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815058 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815078 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815097 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815116 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815137 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815157 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815175 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815194 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815212 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815233 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815253 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815285 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815304 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815324 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815343 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815362 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815380 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815400 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815418 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815437 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815455 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815473 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815492 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815529 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815567 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815590 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815611 4792 reconstruct.go:97] "Volume reconstruction finished" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.815625 4792 reconciler.go:26] "Reconciler: start to sync state" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.827306 4792 manager.go:324] Recovery completed Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.843863 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.846104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.846285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.846394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.847274 4792 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.847299 4792 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.847325 4792 state_mem.go:36] "Initialized new in-memory state store" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.850842 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.852918 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.852959 4792 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.853007 4792 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.853061 4792 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 15:34:11 crc kubenswrapper[4792]: W0318 15:34:11.853889 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.854011 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.872813 4792 policy_none.go:49] "None policy: Start" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.873793 4792 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.873851 4792 
state_mem.go:35] "Initializing new in-memory state store" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.888482 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.926376 4792 manager.go:334] "Starting Device Plugin manager" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.926438 4792 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.926455 4792 server.go:79] "Starting device plugin registration server" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.926920 4792 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.927014 4792 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.927200 4792 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.927331 4792 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.927346 4792 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.932677 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.953251 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 15:34:11 crc 
kubenswrapper[4792]: I0318 15:34:11.953426 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.954804 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.954842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.954852 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.955068 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.955321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.955407 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956582 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956730 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956772 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956831 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956871 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.956890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.958564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.958662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.958687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.959078 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.959099 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.959109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.959208 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: 
I0318 15:34:11.959358 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.959404 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960270 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960503 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960535 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960614 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.960659 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.961765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.961803 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.961818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.962128 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.962170 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.962186 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.962354 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.962388 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.963177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.963204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:11 crc kubenswrapper[4792]: I0318 15:34:11.963216 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:11 crc kubenswrapper[4792]: E0318 15:34:11.994634 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021617 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.021784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022091 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022135 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022294 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022355 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.022416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.027142 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.032447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.032521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.032537 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.032572 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: E0318 15:34:12.033389 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection 
refused" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123844 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123878 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.123941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124037 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124114 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 
18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124202 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124057 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124283 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124447 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124425 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.124491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.233560 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.234796 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.234837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.234847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.234871 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: E0318 15:34:12.235216 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.294541 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.302773 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.322598 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.343203 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.346607 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a27663432820d26d4c59484eb989b26b9a8ecc4e5da7cecf4202dd281930e8bf WatchSource:0}: Error finding container a27663432820d26d4c59484eb989b26b9a8ecc4e5da7cecf4202dd281930e8bf: Status 404 returned error can't find the container with id a27663432820d26d4c59484eb989b26b9a8ecc4e5da7cecf4202dd281930e8bf Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.347107 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5628d8909c29b669804530272e4f9ab974135d1ed8cd711b9b11ca579a2f5582 WatchSource:0}: Error finding container 5628d8909c29b669804530272e4f9ab974135d1ed8cd711b9b11ca579a2f5582: Status 404 returned error can't find the container with id 5628d8909c29b669804530272e4f9ab974135d1ed8cd711b9b11ca579a2f5582 Mar 18 15:34:12 crc kubenswrapper[4792]: 
I0318 15:34:12.351498 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.351823 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a56bb57ddde964ea60950670cf57cfd3c8dc3d6320a562f8962c78671ec8bd2c WatchSource:0}: Error finding container a56bb57ddde964ea60950670cf57cfd3c8dc3d6320a562f8962c78671ec8bd2c: Status 404 returned error can't find the container with id a56bb57ddde964ea60950670cf57cfd3c8dc3d6320a562f8962c78671ec8bd2c Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.360176 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-77292481f99750f0a70d4c287de6dc0852311bd17cc1e3e37d6fae5d7325847b WatchSource:0}: Error finding container 77292481f99750f0a70d4c287de6dc0852311bd17cc1e3e37d6fae5d7325847b: Status 404 returned error can't find the container with id 77292481f99750f0a70d4c287de6dc0852311bd17cc1e3e37d6fae5d7325847b Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.371258 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d00cb925052cf13d610ff43379d1bece9b9e99e77f7d8d98c785d1522f7ef754 WatchSource:0}: Error finding container d00cb925052cf13d610ff43379d1bece9b9e99e77f7d8d98c785d1522f7ef754: Status 404 returned error can't find the container with id d00cb925052cf13d610ff43379d1bece9b9e99e77f7d8d98c785d1522f7ef754 Mar 18 15:34:12 crc kubenswrapper[4792]: E0318 15:34:12.396231 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.635762 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.636938 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.637043 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.637061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.637097 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: E0318 15:34:12.637617 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.783251 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:12 crc kubenswrapper[4792]: W0318 15:34:12.818214 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:12 crc kubenswrapper[4792]: E0318 15:34:12.818354 4792 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.857846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a56bb57ddde964ea60950670cf57cfd3c8dc3d6320a562f8962c78671ec8bd2c"} Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.858741 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a27663432820d26d4c59484eb989b26b9a8ecc4e5da7cecf4202dd281930e8bf"} Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.859663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5628d8909c29b669804530272e4f9ab974135d1ed8cd711b9b11ca579a2f5582"} Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.860854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d00cb925052cf13d610ff43379d1bece9b9e99e77f7d8d98c785d1522f7ef754"} Mar 18 15:34:12 crc kubenswrapper[4792]: I0318 15:34:12.861705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"77292481f99750f0a70d4c287de6dc0852311bd17cc1e3e37d6fae5d7325847b"} Mar 18 15:34:13 crc kubenswrapper[4792]: W0318 15:34:13.005708 4792 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.006160 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.197170 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 18 15:34:13 crc kubenswrapper[4792]: W0318 15:34:13.317736 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.317869 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:13 crc kubenswrapper[4792]: W0318 15:34:13.375555 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
38.102.83.158:6443: connect: connection refused Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.375680 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.438337 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.441254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.441294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.441307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.441334 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.441895 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.768653 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:34:13 crc kubenswrapper[4792]: E0318 15:34:13.769646 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.783516 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.867088 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538" exitCode=0 Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.867163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.867280 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.870340 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.870397 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.870415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.871814 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac" exitCode=0 Mar 18 
15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.871866 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.871935 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.873167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.873241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.873260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.876480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.876556 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.876567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.876589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.876606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.878329 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.878375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.878393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.879353 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e" exitCode=0 Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.879436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.879570 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.880777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc 
kubenswrapper[4792]: I0318 15:34:13.880838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.880863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.881960 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610" exitCode=0 Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.882046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610"} Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.882179 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.882873 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.883166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.883202 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.883219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.884356 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.884402 4792 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:13 crc kubenswrapper[4792]: I0318 15:34:13.884419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.385945 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.392098 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.783204 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Mar 18 15:34:14 crc kubenswrapper[4792]: E0318 15:34:14.797889 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.820930 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887728 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc9b67850f83996c3798f7c47ac639a44d27036a463155450c5be800ee75401d"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887777 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.887924 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.888719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.888746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.888757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.890824 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc" exitCode=0 Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.890887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.890948 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.891550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.891576 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.891587 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.893542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.893621 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.894307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.894336 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.894347 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.896805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.896853 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.896884 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.896860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.897040 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.897059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9"} Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.897727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.897749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:14 crc kubenswrapper[4792]: 
I0318 15:34:14.897758 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.898224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.898247 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:14 crc kubenswrapper[4792]: I0318 15:34:14.898262 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.042820 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.044077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.044118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.044130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.044155 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:15 crc kubenswrapper[4792]: E0318 15:34:15.044648 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.902890 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d" exitCode=0 Mar 18 15:34:15 
crc kubenswrapper[4792]: I0318 15:34:15.902940 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d"} Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903076 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903092 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903109 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903198 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903252 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903200 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.903271 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.904632 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.904673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.904689 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 
15:34:15.904838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.904866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.904877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905173 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905217 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:15 crc kubenswrapper[4792]: I0318 15:34:15.905462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.870014 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09"} Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd"} Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70"} Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911177 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5"} Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911177 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911219 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.911267 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912150 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912201 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912858 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:16 crc kubenswrapper[4792]: I0318 15:34:16.912922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.803166 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.896563 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.920044 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.920699 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.920051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28"} Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.920193 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:17 crc kubenswrapper[4792]: I0318 15:34:17.923610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.043103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.245047 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.246736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.246801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.246824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.246865 4792 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.922629 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.923597 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.924792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.924835 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.924853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.926189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.926228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:18 crc kubenswrapper[4792]: I0318 15:34:18.926249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:19 crc kubenswrapper[4792]: I0318 15:34:19.785332 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:19 crc kubenswrapper[4792]: I0318 15:34:19.785693 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:19 crc kubenswrapper[4792]: I0318 15:34:19.787224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:19 crc 
kubenswrapper[4792]: I0318 15:34:19.787273 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:19 crc kubenswrapper[4792]: I0318 15:34:19.787284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:21 crc kubenswrapper[4792]: I0318 15:34:21.923623 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 15:34:21 crc kubenswrapper[4792]: I0318 15:34:21.923896 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:21 crc kubenswrapper[4792]: I0318 15:34:21.925282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:21 crc kubenswrapper[4792]: I0318 15:34:21.925335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:21 crc kubenswrapper[4792]: I0318 15:34:21.925354 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:21 crc kubenswrapper[4792]: E0318 15:34:21.932812 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.410519 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.410723 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.412242 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.412315 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.412341 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.786236 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:34:22 crc kubenswrapper[4792]: I0318 15:34:22.786336 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 15:34:23 crc kubenswrapper[4792]: I0318 15:34:23.520151 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:23 crc kubenswrapper[4792]: I0318 15:34:23.520306 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:23 crc kubenswrapper[4792]: I0318 15:34:23.521633 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:23 crc kubenswrapper[4792]: I0318 15:34:23.521711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:23 crc kubenswrapper[4792]: I0318 15:34:23.521726 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:25 crc kubenswrapper[4792]: W0318 15:34:25.522253 4792 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.522351 4792 trace.go:236] Trace[1798875814]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 15:34:15.520) (total time: 10001ms): Mar 18 15:34:25 crc kubenswrapper[4792]: Trace[1798875814]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:34:25.522) Mar 18 15:34:25 crc kubenswrapper[4792]: Trace[1798875814]: [10.001901679s] [10.001901679s] END Mar 18 15:34:25 crc kubenswrapper[4792]: E0318 15:34:25.522375 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 15:34:25 crc kubenswrapper[4792]: W0318 15:34:25.569428 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.569601 4792 trace.go:236] Trace[1895711514]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 15:34:15.567) (total time: 10002ms): Mar 18 15:34:25 crc kubenswrapper[4792]: Trace[1895711514]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (15:34:25.569) Mar 18 15:34:25 crc 
kubenswrapper[4792]: Trace[1895711514]: [10.002204752s] [10.002204752s] END Mar 18 15:34:25 crc kubenswrapper[4792]: E0318 15:34:25.569660 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 15:34:25 crc kubenswrapper[4792]: W0318 15:34:25.764942 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.765516 4792 trace.go:236] Trace[603340579]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 15:34:15.763) (total time: 10002ms): Mar 18 15:34:25 crc kubenswrapper[4792]: Trace[603340579]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:34:25.764) Mar 18 15:34:25 crc kubenswrapper[4792]: Trace[603340579]: [10.002462775s] [10.002462775s] END Mar 18 15:34:25 crc kubenswrapper[4792]: E0318 15:34:25.765541 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.784438 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 15:34:25 crc 
kubenswrapper[4792]: I0318 15:34:25.940951 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.943157 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc9b67850f83996c3798f7c47ac639a44d27036a463155450c5be800ee75401d" exitCode=255 Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.943207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc9b67850f83996c3798f7c47ac639a44d27036a463155450c5be800ee75401d"} Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.943403 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.944580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.944622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.944637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:25 crc kubenswrapper[4792]: I0318 15:34:25.945325 4792 scope.go:117] "RemoveContainer" containerID="dc9b67850f83996c3798f7c47ac639a44d27036a463155450c5be800ee75401d" Mar 18 15:34:26 crc kubenswrapper[4792]: W0318 15:34:26.014583 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 
15:34:26.014693 4792 trace.go:236] Trace[1754445178]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 15:34:16.013) (total time: 10001ms): Mar 18 15:34:26 crc kubenswrapper[4792]: Trace[1754445178]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:34:26.014) Mar 18 15:34:26 crc kubenswrapper[4792]: Trace[1754445178]: [10.001647026s] [10.001647026s] END Mar 18 15:34:26 crc kubenswrapper[4792]: E0318 15:34:26.014719 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 15:34:26 crc kubenswrapper[4792]: E0318 15:34:26.031147 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:26 crc kubenswrapper[4792]: E0318 15:34:26.031534 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:26Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 15:34:26 crc kubenswrapper[4792]: E0318 15:34:26.032642 4792 kubelet_node_status.go:99] "Unable to register node with API 
server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:34:26 crc kubenswrapper[4792]: E0318 15:34:26.035807 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:26Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df967a3f69709 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,LastTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.036888 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.036948 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:34:26 crc 
kubenswrapper[4792]: I0318 15:34:26.043268 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.043368 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.598003 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.786045 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:26Z is after 2026-02-23T05:33:13Z Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.949235 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.951865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c"} Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.952127 4792 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.953350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.953399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:26 crc kubenswrapper[4792]: I0318 15:34:26.953423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.787479 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:27Z is after 2026-02-23T05:33:13Z Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.906101 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.957409 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.958450 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.961018 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" exitCode=255 Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.961083 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c"} Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.961135 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.961160 4792 scope.go:117] "RemoveContainer" containerID="dc9b67850f83996c3798f7c47ac639a44d27036a463155450c5be800ee75401d" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.962315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.962377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.962401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.963317 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:27 crc kubenswrapper[4792]: E0318 15:34:27.963589 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:27 crc kubenswrapper[4792]: I0318 15:34:27.968577 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.043412 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.788427 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:28Z is after 2026-02-23T05:33:13Z Mar 18 15:34:28 crc kubenswrapper[4792]: W0318 15:34:28.883153 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:28Z is after 2026-02-23T05:33:13Z Mar 18 15:34:28 crc kubenswrapper[4792]: E0318 15:34:28.883234 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.966498 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.969084 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.969940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.969996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.970005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:28 crc kubenswrapper[4792]: I0318 15:34:28.970480 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:28 crc kubenswrapper[4792]: E0318 15:34:28.970654 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:28 crc kubenswrapper[4792]: W0318 15:34:28.982798 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:28Z is after 2026-02-23T05:33:13Z Mar 18 15:34:28 crc kubenswrapper[4792]: E0318 15:34:28.982935 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.787559 
4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:29Z is after 2026-02-23T05:33:13Z Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.971598 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.973002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.973067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.973086 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:29 crc kubenswrapper[4792]: I0318 15:34:29.974001 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:29 crc kubenswrapper[4792]: E0318 15:34:29.974306 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:30 crc kubenswrapper[4792]: I0318 15:34:30.787818 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-18T15:34:30Z is after 2026-02-23T05:33:13Z Mar 18 15:34:31 crc kubenswrapper[4792]: W0318 15:34:31.230000 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:31Z is after 2026-02-23T05:33:13Z Mar 18 15:34:31 crc kubenswrapper[4792]: E0318 15:34:31.230108 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:31 crc kubenswrapper[4792]: I0318 15:34:31.789242 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:31Z is after 2026-02-23T05:33:13Z Mar 18 15:34:31 crc kubenswrapper[4792]: E0318 15:34:31.932955 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:34:31 crc kubenswrapper[4792]: W0318 15:34:31.933745 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:31Z is after 2026-02-23T05:33:13Z Mar 18 15:34:31 crc 
kubenswrapper[4792]: E0318 15:34:31.934220 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.433141 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.434927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.435254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.435457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.435672 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:32 crc kubenswrapper[4792]: E0318 15:34:32.437086 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:32Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:34:32 crc kubenswrapper[4792]: E0318 15:34:32.439837 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-18T15:34:32Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.446604 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.446854 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.448281 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.448322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.448336 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.466495 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.787309 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.787803 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:34:32 crc 
kubenswrapper[4792]: I0318 15:34:32.789202 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:32Z is after 2026-02-23T05:33:13Z Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.981553 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.983406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.983470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:32 crc kubenswrapper[4792]: I0318 15:34:32.983482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:33 crc kubenswrapper[4792]: I0318 15:34:33.787449 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:33Z is after 2026-02-23T05:33:13Z Mar 18 15:34:34 crc kubenswrapper[4792]: I0318 15:34:34.597492 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:34:34 crc kubenswrapper[4792]: E0318 15:34:34.601580 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:34 crc kubenswrapper[4792]: I0318 15:34:34.786108 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:34Z is after 2026-02-23T05:33:13Z Mar 18 15:34:35 crc kubenswrapper[4792]: I0318 15:34:35.787279 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:35Z is after 2026-02-23T05:33:13Z Mar 18 15:34:36 crc kubenswrapper[4792]: E0318 15:34:36.043119 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:36Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df967a3f69709 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,LastTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.597504 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.597800 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.599496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.599555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.599570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.600360 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:36 crc kubenswrapper[4792]: E0318 15:34:36.600539 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:36 crc kubenswrapper[4792]: I0318 15:34:36.785487 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:36Z is after 2026-02-23T05:33:13Z Mar 18 15:34:36 crc kubenswrapper[4792]: W0318 15:34:36.977297 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: 
Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:36Z is after 2026-02-23T05:33:13Z Mar 18 15:34:36 crc kubenswrapper[4792]: E0318 15:34:36.977426 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:37 crc kubenswrapper[4792]: W0318 15:34:37.245818 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:37Z is after 2026-02-23T05:33:13Z Mar 18 15:34:37 crc kubenswrapper[4792]: E0318 15:34:37.246217 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:37 crc kubenswrapper[4792]: I0318 15:34:37.786963 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-18T15:34:37Z is after 2026-02-23T05:33:13Z Mar 18 15:34:38 crc kubenswrapper[4792]: I0318 15:34:38.788163 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:38Z is after 2026-02-23T05:33:13Z Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.440428 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:39 crc kubenswrapper[4792]: E0318 15:34:39.441827 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:39Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.441961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.442078 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.442107 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.442166 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:39 crc kubenswrapper[4792]: E0318 15:34:39.446648 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-18T15:34:39Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:34:39 crc kubenswrapper[4792]: I0318 15:34:39.786206 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:39Z is after 2026-02-23T05:33:13Z Mar 18 15:34:40 crc kubenswrapper[4792]: I0318 15:34:40.787720 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:40Z is after 2026-02-23T05:33:13Z Mar 18 15:34:41 crc kubenswrapper[4792]: I0318 15:34:41.786941 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:41Z is after 2026-02-23T05:33:13Z Mar 18 15:34:41 crc kubenswrapper[4792]: W0318 15:34:41.807028 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:41Z is after 2026-02-23T05:33:13Z Mar 18 15:34:41 crc kubenswrapper[4792]: E0318 15:34:41.807136 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:41 crc kubenswrapper[4792]: E0318 15:34:41.933958 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:34:42 crc kubenswrapper[4792]: W0318 15:34:42.175915 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 2026-02-23T05:33:13Z Mar 18 15:34:42 crc kubenswrapper[4792]: E0318 15:34:42.176069 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.787154 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.787259 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.787362 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.787599 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.789529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.789598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.789622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.789524 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 2026-02-23T05:33:13Z Mar 18 15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.790487 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 
15:34:42 crc kubenswrapper[4792]: I0318 15:34:42.790821 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d" gracePeriod=30 Mar 18 15:34:43 crc kubenswrapper[4792]: I0318 15:34:43.011151 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:34:43 crc kubenswrapper[4792]: I0318 15:34:43.011881 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d" exitCode=255 Mar 18 15:34:43 crc kubenswrapper[4792]: I0318 15:34:43.011920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d"} Mar 18 15:34:43 crc kubenswrapper[4792]: I0318 15:34:43.786158 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:43Z is after 2026-02-23T05:33:13Z Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.018964 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.019606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda"} Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.019729 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.021124 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.021190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.021215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.786878 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:44Z is after 2026-02-23T05:33:13Z Mar 18 15:34:44 crc kubenswrapper[4792]: I0318 15:34:44.821343 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:45 crc kubenswrapper[4792]: I0318 15:34:45.021519 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:45 crc kubenswrapper[4792]: I0318 15:34:45.024012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:45 crc kubenswrapper[4792]: I0318 15:34:45.024058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:34:45 crc kubenswrapper[4792]: I0318 15:34:45.024073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:45 crc kubenswrapper[4792]: I0318 15:34:45.787469 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:45Z is after 2026-02-23T05:33:13Z Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.025092 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.026617 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.026877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.027143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:46 crc kubenswrapper[4792]: E0318 15:34:46.049585 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:46Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df967a3f69709 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.781637897 
+0000 UTC m=+0.650966864,LastTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.447478 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.449418 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.449505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.449537 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.449584 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:46 crc kubenswrapper[4792]: E0318 15:34:46.450705 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:46Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 15:34:46 crc kubenswrapper[4792]: E0318 15:34:46.456985 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:46Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 15:34:46 crc kubenswrapper[4792]: I0318 15:34:46.785930 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:46Z is after 2026-02-23T05:33:13Z Mar 18 15:34:47 crc kubenswrapper[4792]: I0318 15:34:47.790652 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:48 crc kubenswrapper[4792]: I0318 15:34:48.791010 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.786049 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.786247 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.787549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.787602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.787616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:49 crc kubenswrapper[4792]: I0318 15:34:49.790416 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.789695 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.853481 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.855513 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.855590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.855616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:50 crc kubenswrapper[4792]: I0318 15:34:50.856671 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:51 crc kubenswrapper[4792]: I0318 15:34:51.167694 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 15:34:51 crc kubenswrapper[4792]: I0318 15:34:51.185291 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 15:34:51 crc kubenswrapper[4792]: I0318 15:34:51.789504 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:51 crc kubenswrapper[4792]: E0318 15:34:51.934581 4792 eviction_manager.go:285] "Eviction manager: failed to get summary 
stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.042887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.043372 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.045104 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" exitCode=255 Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.045141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab"} Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.045177 4792 scope.go:117] "RemoveContainer" containerID="174cd6f856688565e150c68af23d3316ec1de27584799997f87506addc2bbf6c" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.045350 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.046067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.046095 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.046104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:52 crc 
kubenswrapper[4792]: I0318 15:34:52.046525 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:34:52 crc kubenswrapper[4792]: E0318 15:34:52.046666 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.786509 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.786615 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:34:52 crc kubenswrapper[4792]: I0318 15:34:52.789667 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.050445 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:34:53 crc kubenswrapper[4792]: W0318 15:34:53.296435 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 15:34:53 crc kubenswrapper[4792]: E0318 15:34:53.296485 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.457337 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:53 crc kubenswrapper[4792]: E0318 15:34:53.459089 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.459098 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.459188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.459221 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.459271 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:34:53 crc kubenswrapper[4792]: E0318 15:34:53.467046 4792 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:34:53 crc kubenswrapper[4792]: I0318 15:34:53.788861 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:54 crc kubenswrapper[4792]: I0318 15:34:54.790890 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:55 crc kubenswrapper[4792]: I0318 15:34:55.833325 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.058686 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a3f69709 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,LastTimestamp:2026-03-18 15:34:11.781637897 +0000 UTC m=+0.650966864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.065025 
4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.073037 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.078122 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.085243 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967accb649c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.929801884 +0000 UTC m=+0.799130821,LastTimestamp:2026-03-18 15:34:11.929801884 +0000 UTC m=+0.799130821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.089602 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.954833064 +0000 UTC m=+0.824162001,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.092503 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.954848694 +0000 UTC m=+0.824177631,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.094736 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.954858665 +0000 UTC m=+0.824187602,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.096826 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.956371922 +0000 UTC m=+0.825700869,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.101827 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC 
m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.956397832 +0000 UTC m=+0.825726779,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.106889 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.956410403 +0000 UTC m=+0.825739350,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.111815 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.95685538 +0000 UTC m=+0.826184337,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.117140 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.956880071 +0000 UTC m=+0.826209018,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.123935 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.956897851 +0000 UTC m=+0.826226808,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.130636 4792 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.958636225 +0000 UTC m=+0.827965182,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.136893 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.958681367 +0000 UTC m=+0.828010314,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.142051 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.95874439 +0000 UTC m=+0.828073337,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.147155 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.959093673 +0000 UTC m=+0.828422610,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.151909 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.959105673 +0000 UTC m=+0.828434610,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.157443 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.959113873 +0000 UTC m=+0.828442810,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.162370 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.960296527 +0000 UTC m=+0.829625484,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.167506 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.960326328 +0000 UTC m=+0.829655285,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.172879 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d42481\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d42481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846489217 +0000 UTC 
m=+0.715818164,LastTimestamp:2026-03-18 15:34:11.960341128 +0000 UTC m=+0.829670085,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.180290 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d096c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d096c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846256329 +0000 UTC m=+0.715585286,LastTimestamp:2026-03-18 15:34:11.960500864 +0000 UTC m=+0.829829821,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.187186 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df967a7d2845b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df967a7d2845b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:11.846382683 +0000 UTC m=+0.715711640,LastTimestamp:2026-03-18 15:34:11.960546436 +0000 UTC m=+0.829875383,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.195886 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df967c613175d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.353931101 +0000 UTC m=+1.223260048,LastTimestamp:2026-03-18 15:34:12.353931101 +0000 UTC m=+1.223260048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.202253 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df967c6138252 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.353958482 +0000 UTC m=+1.223287469,LastTimestamp:2026-03-18 15:34:12.353958482 +0000 UTC m=+1.223287469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.209285 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967c6149189 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.354027913 +0000 UTC m=+1.223356900,LastTimestamp:2026-03-18 15:34:12.354027913 +0000 UTC m=+1.223356900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.216416 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df967c6b647f3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.364625907 +0000 UTC m=+1.233954854,LastTimestamp:2026-03-18 15:34:12.364625907 +0000 UTC m=+1.233954854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.221320 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df967c7a36218 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.380164632 +0000 UTC m=+1.249493569,LastTimestamp:2026-03-18 15:34:12.380164632 +0000 UTC m=+1.249493569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.229859 4792 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967ea3acfef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.960514031 +0000 UTC m=+1.829842958,LastTimestamp:2026-03-18 15:34:12.960514031 +0000 UTC m=+1.829842958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.237310 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df967ea94ef40 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.966420288 +0000 UTC m=+1.835749225,LastTimestamp:2026-03-18 15:34:12.966420288 +0000 UTC m=+1.835749225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.242698 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df967ead168ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.970383534 +0000 UTC m=+1.839712471,LastTimestamp:2026-03-18 15:34:12.970383534 +0000 UTC m=+1.839712471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.249391 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967eb257ae8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.975893224 +0000 UTC m=+1.845222161,LastTimestamp:2026-03-18 15:34:12.975893224 
+0000 UTC m=+1.845222161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.255305 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df967eb39ffe2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.977237986 +0000 UTC m=+1.846566923,LastTimestamp:2026-03-18 15:34:12.977237986 +0000 UTC m=+1.846566923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.260602 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967eb3b6171 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.977328497 +0000 UTC m=+1.846657454,LastTimestamp:2026-03-18 15:34:12.977328497 +0000 UTC m=+1.846657454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.267845 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df967eb3c5e6b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.977393259 +0000 UTC m=+1.846722196,LastTimestamp:2026-03-18 15:34:12.977393259 +0000 UTC m=+1.846722196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.275229 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df967eb447c2f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.977925167 +0000 UTC m=+1.847254104,LastTimestamp:2026-03-18 15:34:12.977925167 +0000 UTC m=+1.847254104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.282344 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df967ebb354e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.985189607 +0000 UTC m=+1.854518544,LastTimestamp:2026-03-18 15:34:12.985189607 +0000 UTC m=+1.854518544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.289549 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df967ec139905 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.991498501 +0000 UTC m=+1.860827438,LastTimestamp:2026-03-18 15:34:12.991498501 +0000 UTC m=+1.860827438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.298691 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df967ec29cf48 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.992954184 +0000 UTC m=+1.862283121,LastTimestamp:2026-03-18 15:34:12.992954184 +0000 UTC m=+1.862283121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.305625 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967fd099b0c openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.276056332 +0000 UTC m=+2.145385299,LastTimestamp:2026-03-18 15:34:13.276056332 +0000 UTC m=+2.145385299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.311556 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967fd8f5e1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.284822556 +0000 UTC m=+2.154151493,LastTimestamp:2026-03-18 15:34:13.284822556 +0000 UTC m=+2.154151493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.318313 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967fda223e5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.286052837 +0000 UTC m=+2.155381814,LastTimestamp:2026-03-18 15:34:13.286052837 +0000 UTC m=+2.155381814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.325836 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96808bddb7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.472418683 +0000 UTC m=+2.341747620,LastTimestamp:2026-03-18 15:34:13.472418683 +0000 UTC 
m=+2.341747620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.333049 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96809c15359 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.489423193 +0000 UTC m=+2.358752130,LastTimestamp:2026-03-18 15:34:13.489423193 +0000 UTC m=+2.358752130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.338520 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96809d82bba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.490920378 +0000 UTC m=+2.360249315,LastTimestamp:2026-03-18 15:34:13.490920378 +0000 UTC m=+2.360249315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.343885 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df968170921af openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.712232879 +0000 UTC m=+2.581561816,LastTimestamp:2026-03-18 15:34:13.712232879 +0000 UTC m=+2.581561816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.350330 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96817f30eb1 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.727563441 +0000 UTC m=+2.596892378,LastTimestamp:2026-03-18 15:34:13.727563441 +0000 UTC m=+2.596892378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.357461 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df9682096b657 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.872506455 +0000 UTC m=+2.741835432,LastTimestamp:2026-03-18 15:34:13.872506455 +0000 UTC m=+2.741835432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 
15:34:56.364915 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df96820b904fc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.874754812 +0000 UTC m=+2.744083749,LastTimestamp:2026-03-18 15:34:13.874754812 +0000 UTC m=+2.744083749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.370206 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9682132f322 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.882745634 +0000 UTC m=+2.752074571,LastTimestamp:2026-03-18 15:34:13.882745634 +0000 UTC m=+2.752074571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.377252 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968219055ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.888865774 +0000 UTC m=+2.758194721,LastTimestamp:2026-03-18 15:34:13.888865774 +0000 UTC m=+2.758194721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.382499 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df9682eb28753 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.109210451 +0000 UTC m=+2.978539378,LastTimestamp:2026-03-18 15:34:14.109210451 +0000 UTC m=+2.978539378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.391357 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9682eb4f512 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.109369618 +0000 UTC m=+2.978698555,LastTimestamp:2026-03-18 15:34:14.109369618 +0000 UTC m=+2.978698555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.399109 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9682ee9af03 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.112825091 +0000 UTC m=+2.982154028,LastTimestamp:2026-03-18 15:34:14.112825091 +0000 UTC m=+2.982154028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.404117 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9682f468422 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.118908962 +0000 UTC m=+2.988237899,LastTimestamp:2026-03-18 15:34:14.118908962 +0000 UTC m=+2.988237899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.408745 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9682f5b591b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.120274203 +0000 UTC m=+2.989603140,LastTimestamp:2026-03-18 15:34:14.120274203 +0000 UTC m=+2.989603140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.415691 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df9682f82431d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.122824477 +0000 UTC m=+2.992153414,LastTimestamp:2026-03-18 15:34:14.122824477 +0000 UTC m=+2.992153414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc 
kubenswrapper[4792]: E0318 15:34:56.421831 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9682ff55efc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.130368252 +0000 UTC m=+2.999697189,LastTimestamp:2026-03-18 15:34:14.130368252 +0000 UTC m=+2.999697189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.428312 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9683000e082 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.131122306 +0000 UTC m=+3.000451243,LastTimestamp:2026-03-18 15:34:14.131122306 
+0000 UTC m=+3.000451243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.434324 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df96830414a9d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.135343773 +0000 UTC m=+3.004672720,LastTimestamp:2026-03-18 15:34:14.135343773 +0000 UTC m=+3.004672720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.440619 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9683292b485 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.174233733 +0000 UTC m=+3.043562670,LastTimestamp:2026-03-18 15:34:14.174233733 +0000 UTC 
m=+3.043562670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.446581 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9683b003094 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.315626644 +0000 UTC m=+3.184955581,LastTimestamp:2026-03-18 15:34:14.315626644 +0000 UTC m=+3.184955581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.452276 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9683b210536 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 
15:34:14.31777823 +0000 UTC m=+3.187107187,LastTimestamp:2026-03-18 15:34:14.31777823 +0000 UTC m=+3.187107187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.456936 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9683bdaf826 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.329964582 +0000 UTC m=+3.199293539,LastTimestamp:2026-03-18 15:34:14.329964582 +0000 UTC m=+3.199293539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.463473 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9683beb058c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.331016588 +0000 UTC m=+3.200345545,LastTimestamp:2026-03-18 15:34:14.331016588 +0000 UTC m=+3.200345545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.469084 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9683c07b4bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.332896443 +0000 UTC m=+3.202225390,LastTimestamp:2026-03-18 15:34:14.332896443 +0000 UTC m=+3.202225390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.476129 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9683c16b86a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.333880426 +0000 UTC m=+3.203209363,LastTimestamp:2026-03-18 15:34:14.333880426 +0000 UTC m=+3.203209363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.482587 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df9684596ebc8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.493277128 +0000 UTC m=+3.362606065,LastTimestamp:2026-03-18 15:34:14.493277128 +0000 UTC m=+3.362606065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.488853 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df96845ade22a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.494781994 +0000 UTC m=+3.364110931,LastTimestamp:2026-03-18 15:34:14.494781994 +0000 UTC m=+3.364110931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.495509 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df968464e5ca8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.505299112 
+0000 UTC m=+3.374628049,LastTimestamp:2026-03-18 15:34:14.505299112 +0000 UTC m=+3.374628049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.503244 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9684678ef12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.508089106 +0000 UTC m=+3.377418033,LastTimestamp:2026-03-18 15:34:14.508089106 +0000 UTC m=+3.377418033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.505555 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df968468785f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.509045239 +0000 UTC m=+3.378374176,LastTimestamp:2026-03-18 15:34:14.509045239 +0000 UTC m=+3.378374176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.508547 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df968504b1858 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.672857176 +0000 UTC m=+3.542186113,LastTimestamp:2026-03-18 15:34:14.672857176 +0000 UTC m=+3.542186113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.513854 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df96851007111 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.684741905 +0000 UTC m=+3.554070842,LastTimestamp:2026-03-18 15:34:14.684741905 +0000 UTC m=+3.554070842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.520232 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df968511561ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.686114287 +0000 UTC m=+3.555443244,LastTimestamp:2026-03-18 15:34:14.686114287 +0000 UTC m=+3.555443244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.525046 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9685adfdc48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.850378824 +0000 UTC m=+3.719707761,LastTimestamp:2026-03-18 15:34:14.850378824 +0000 UTC m=+3.719707761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.529450 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9685b727650 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.859986512 +0000 UTC m=+3.729315449,LastTimestamp:2026-03-18 15:34:14.859986512 +0000 UTC m=+3.729315449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.536355 4792 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df9685d639263 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.892565091 +0000 UTC m=+3.761894028,LastTimestamp:2026-03-18 15:34:14.892565091 +0000 UTC m=+3.761894028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.540621 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968686edfb2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:15.077855154 +0000 UTC m=+3.947184091,LastTimestamp:2026-03-18 15:34:15.077855154 +0000 UTC m=+3.947184091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.544292 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968690bc865 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:15.088138341 +0000 UTC m=+3.957467288,LastTimestamp:2026-03-18 15:34:15.088138341 +0000 UTC m=+3.957467288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.549409 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df96899d7fade openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:15.906826974 +0000 UTC m=+4.776155951,LastTimestamp:2026-03-18 15:34:15.906826974 +0000 UTC m=+4.776155951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.553425 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968a81c109b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.146170011 +0000 UTC m=+5.015498948,LastTimestamp:2026-03-18 15:34:16.146170011 +0000 UTC m=+5.015498948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.557495 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968a8e396da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.159246042 +0000 UTC m=+5.028575019,LastTimestamp:2026-03-18 15:34:16.159246042 +0000 UTC m=+5.028575019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.561771 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968a8f2dfff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.160247807 +0000 UTC m=+5.029576784,LastTimestamp:2026-03-18 15:34:16.160247807 +0000 UTC m=+5.029576784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.566337 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968b703347b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.396199035 +0000 UTC m=+5.265528012,LastTimestamp:2026-03-18 15:34:16.396199035 +0000 UTC m=+5.265528012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.570225 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968b7fc3350 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.4125172 +0000 UTC m=+5.281846177,LastTimestamp:2026-03-18 15:34:16.4125172 +0000 UTC m=+5.281846177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.573439 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968b81523f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.414151673 +0000 UTC m=+5.283480650,LastTimestamp:2026-03-18 15:34:16.414151673 +0000 UTC 
m=+5.283480650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.579414 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968c6159e6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.649064043 +0000 UTC m=+5.518393010,LastTimestamp:2026-03-18 15:34:16.649064043 +0000 UTC m=+5.518393010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.583064 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968c6a4442e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.65841259 +0000 UTC m=+5.527741537,LastTimestamp:2026-03-18 15:34:16.65841259 +0000 UTC m=+5.527741537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.587315 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968c6b59bd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.659549141 +0000 UTC m=+5.528878088,LastTimestamp:2026-03-18 15:34:16.659549141 +0000 UTC m=+5.528878088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.591623 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968d2db963c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.863364668 +0000 UTC m=+5.732693605,LastTimestamp:2026-03-18 15:34:16.863364668 +0000 UTC 
m=+5.732693605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.596996 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.597267 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.598280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.598305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.598316 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.598853 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.599088 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.599118 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968d410e3ba 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.88363513 +0000 UTC m=+5.752964087,LastTimestamp:2026-03-18 15:34:16.88363513 +0000 UTC m=+5.752964087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.604007 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968d4295ec3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:16.885239491 +0000 UTC m=+5.754568438,LastTimestamp:2026-03-18 15:34:16.885239491 +0000 UTC m=+5.754568438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.607785 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df968e03b3bf4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:17.08773682 +0000 UTC m=+5.957065757,LastTimestamp:2026-03-18 15:34:17.08773682 +0000 UTC m=+5.957065757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.609809 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df968e0ebe3af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:17.099314095 +0000 UTC m=+5.968643062,LastTimestamp:2026-03-18 15:34:17.099314095 +0000 UTC m=+5.968643062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.613368 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189df96a33e48520 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 15:34:56 crc kubenswrapper[4792]: body: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:22.78630736 +0000 UTC m=+11.655636317,LastTimestamp:2026-03-18 15:34:22.78630736 +0000 UTC m=+11.655636317,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.617729 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96a33e5b1c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:22.786384323 +0000 UTC 
m=+11.655713300,LastTimestamp:2026-03-18 15:34:22.786384323 +0000 UTC m=+11.655713300,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.626018 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df968511561ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df968511561ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.686114287 +0000 UTC m=+3.555443244,LastTimestamp:2026-03-18 15:34:25.946856771 +0000 UTC m=+14.816185708,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.629736 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.189df96af5a5214e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 15:34:56 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:34:56 crc kubenswrapper[4792]: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:26.036932942 +0000 UTC m=+14.906261879,LastTimestamp:2026-03-18 15:34:26.036932942 +0000 UTC m=+14.906261879,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.637150 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df96af5a5fe18 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:26.036989464 +0000 UTC m=+14.906318401,LastTimestamp:2026-03-18 15:34:26.036989464 +0000 UTC m=+14.906318401,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.640589 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df96af5a5214e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.189df96af5a5214e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 15:34:56 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 15:34:56 crc kubenswrapper[4792]: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:26.036932942 +0000 UTC m=+14.906261879,LastTimestamp:2026-03-18 15:34:26.043340582 +0000 UTC m=+14.912669519,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.644840 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df96af5a5fe18\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df96af5a5fe18 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:26.036989464 +0000 UTC m=+14.906318401,LastTimestamp:2026-03-18 15:34:26.043395114 +0000 UTC m=+14.912724051,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.649766 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df9685adfdc48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9685adfdc48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.850378824 +0000 UTC m=+3.719707761,LastTimestamp:2026-03-18 15:34:26.157408498 +0000 UTC m=+15.026737435,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.653896 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df9685b727650\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df9685b727650 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:14.859986512 +0000 UTC m=+3.729315449,LastTimestamp:2026-03-18 15:34:26.175625119 +0000 UTC m=+15.044954066,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.659203 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189df96c88069aca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 15:34:56 crc kubenswrapper[4792]: body: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787761866 +0000 UTC 
m=+21.657090833,LastTimestamp:2026-03-18 15:34:32.787761866 +0000 UTC m=+21.657090833,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.668155 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96c8809ab28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787962664 +0000 UTC m=+21.657291641,LastTimestamp:2026-03-18 15:34:32.787962664 +0000 UTC m=+21.657291641,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.674896 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df96c88069aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189df96c88069aca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 15:34:56 crc kubenswrapper[4792]: body: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787761866 +0000 UTC m=+21.657090833,LastTimestamp:2026-03-18 15:34:42.787229344 +0000 UTC m=+31.656558331,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.679928 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df96c8809ab28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96c8809ab28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787962664 +0000 UTC m=+21.657291641,LastTimestamp:2026-03-18 
15:34:42.787313868 +0000 UTC m=+31.656642855,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.684305 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96edc409c2b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:42.790784043 +0000 UTC m=+31.660113030,LastTimestamp:2026-03-18 15:34:42.790784043 +0000 UTC m=+31.660113030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.688210 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df967eb3b6171\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967eb3b6171 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:12.977328497 +0000 UTC m=+1.846657454,LastTimestamp:2026-03-18 15:34:42.911692016 +0000 UTC m=+31.781020993,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.692997 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df967fd099b0c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967fd099b0c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.276056332 +0000 UTC m=+2.145385299,LastTimestamp:2026-03-18 15:34:43.153857216 +0000 UTC m=+32.023186193,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.696734 4792 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189df967fd8f5e1c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df967fd8f5e1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:13.284822556 +0000 UTC m=+2.154151493,LastTimestamp:2026-03-18 15:34:43.166705287 +0000 UTC m=+32.036034264,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.702112 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df96c88069aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 15:34:56 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189df96c88069aca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
18 15:34:56 crc kubenswrapper[4792]: body: Mar 18 15:34:56 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787761866 +0000 UTC m=+21.657090833,LastTimestamp:2026-03-18 15:34:52.786591178 +0000 UTC m=+41.655920145,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 15:34:56 crc kubenswrapper[4792]: > Mar 18 15:34:56 crc kubenswrapper[4792]: E0318 15:34:56.705330 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df96c8809ab28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df96c8809ab28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:34:32.787962664 +0000 UTC m=+21.657291641,LastTimestamp:2026-03-18 15:34:52.786652831 +0000 UTC m=+41.655981808,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:34:56 crc kubenswrapper[4792]: I0318 15:34:56.788029 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 15:34:57 crc kubenswrapper[4792]: I0318 15:34:57.788920 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.043442 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.043999 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.045335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.045408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.045427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.046536 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:34:58 crc kubenswrapper[4792]: E0318 15:34:58.046826 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:34:58 crc kubenswrapper[4792]: I0318 15:34:58.787822 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.792896 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.793075 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.794014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.794055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.794065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.794751 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:34:59 crc kubenswrapper[4792]: I0318 15:34:59.797291 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:35:00 crc kubenswrapper[4792]: W0318 15:35:00.060203 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 15:35:00 crc kubenswrapper[4792]: E0318 15:35:00.060264 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.069330 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.070319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.070355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.070367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:00 crc kubenswrapper[4792]: E0318 15:35:00.463673 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.467783 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.469028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.469064 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.469077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.469101 4792 kubelet_node_status.go:76] "Attempting to register 
node" node="crc" Mar 18 15:35:00 crc kubenswrapper[4792]: E0318 15:35:00.473027 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:35:00 crc kubenswrapper[4792]: I0318 15:35:00.787988 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:00 crc kubenswrapper[4792]: W0318 15:35:00.921828 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:00 crc kubenswrapper[4792]: E0318 15:35:00.921909 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 15:35:01 crc kubenswrapper[4792]: I0318 15:35:01.787928 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:01 crc kubenswrapper[4792]: E0318 15:35:01.934881 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:35:02 crc kubenswrapper[4792]: I0318 15:35:02.789654 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.731606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.731777 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.733148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.733200 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.733214 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:03 crc kubenswrapper[4792]: I0318 15:35:03.787760 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:04 crc kubenswrapper[4792]: I0318 15:35:04.789592 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:05 crc kubenswrapper[4792]: I0318 15:35:05.786637 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:06 crc kubenswrapper[4792]: W0318 15:35:06.589808 4792 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 15:35:06 crc kubenswrapper[4792]: E0318 15:35:06.590725 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 15:35:06 crc kubenswrapper[4792]: I0318 15:35:06.790410 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:07 crc kubenswrapper[4792]: E0318 15:35:07.468628 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 15:35:07.473645 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 15:35:07.474894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 15:35:07.474929 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 15:35:07.474941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 
15:35:07.474962 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:35:07 crc kubenswrapper[4792]: E0318 15:35:07.480737 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:35:07 crc kubenswrapper[4792]: I0318 15:35:07.788687 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:08 crc kubenswrapper[4792]: I0318 15:35:08.790666 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:09 crc kubenswrapper[4792]: I0318 15:35:09.789281 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:10 crc kubenswrapper[4792]: I0318 15:35:10.787077 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.790585 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.853479 4792 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.855283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.855349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.855372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:11 crc kubenswrapper[4792]: I0318 15:35:11.856491 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:35:11 crc kubenswrapper[4792]: E0318 15:35:11.856791 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:11 crc kubenswrapper[4792]: E0318 15:35:11.935077 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:35:12 crc kubenswrapper[4792]: I0318 15:35:12.791358 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:13 crc kubenswrapper[4792]: I0318 15:35:13.789379 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 18 15:35:14 crc kubenswrapper[4792]: E0318 15:35:14.478443 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.481488 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.482911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.482985 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.483001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.483037 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:35:14 crc kubenswrapper[4792]: E0318 15:35:14.487998 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 15:35:14 crc kubenswrapper[4792]: I0318 15:35:14.790079 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 15:35:15 crc kubenswrapper[4792]: I0318 15:35:15.790389 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 15:35:16 crc kubenswrapper[4792]: I0318 15:35:16.216088 4792 csr.go:261] certificate signing request csr-tz2r6 is approved, waiting to be issued Mar 18 15:35:16 crc kubenswrapper[4792]: I0318 15:35:16.224423 4792 csr.go:257] certificate signing request csr-tz2r6 is issued Mar 18 15:35:16 crc kubenswrapper[4792]: I0318 15:35:16.245217 4792 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 15:35:16 crc kubenswrapper[4792]: I0318 15:35:16.632552 4792 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 15:35:17 crc kubenswrapper[4792]: I0318 15:35:17.226260 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 14:17:52.33896466 +0000 UTC Mar 18 15:35:17 crc kubenswrapper[4792]: I0318 15:35:17.226324 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6622h42m35.112644627s for next certificate rotation Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.488776 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.490504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.490610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.490638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.490827 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.497366 4792 
kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.497785 4792 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.497826 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.501416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.501463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.501477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.501495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.501508 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:21Z","lastTransitionTime":"2026-03-18T15:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.516551 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.526427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.526470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.526490 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.526515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.526533 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:21Z","lastTransitionTime":"2026-03-18T15:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.539143 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.550419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.550476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.550486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.550502 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.550518 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:21Z","lastTransitionTime":"2026-03-18T15:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.565644 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.574789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.574849 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.574860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.574878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:21 crc kubenswrapper[4792]: I0318 15:35:21.574890 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:21Z","lastTransitionTime":"2026-03-18T15:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.591133 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.591397 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.591441 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.691731 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.792189 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.892940 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.935764 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 15:35:21 crc kubenswrapper[4792]: E0318 15:35:21.993285 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.094237 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.194990 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.296021 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: 
E0318 15:35:22.396760 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.497807 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.598197 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.699320 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.799815 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:22 crc kubenswrapper[4792]: E0318 15:35:22.900093 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.001076 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.101840 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.202098 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.303230 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.404030 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.525808 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.626411 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.726920 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.828164 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:23 crc kubenswrapper[4792]: E0318 15:35:23.928922 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.030050 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.130272 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.230923 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.331477 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.432526 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.533717 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.634136 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.734954 4792 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.835120 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:24 crc kubenswrapper[4792]: I0318 15:35:24.854172 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:24 crc kubenswrapper[4792]: I0318 15:35:24.855692 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:24 crc kubenswrapper[4792]: I0318 15:35:24.855753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:24 crc kubenswrapper[4792]: I0318 15:35:24.855773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:24 crc kubenswrapper[4792]: I0318 15:35:24.856620 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:35:24 crc kubenswrapper[4792]: E0318 15:35:24.936049 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.037370 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.137991 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.140727 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.143888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44"} Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.144091 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.147229 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.147263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:25 crc kubenswrapper[4792]: I0318 15:35:25.147277 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.238151 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.338502 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.439637 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.539854 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.641054 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.742095 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.842623 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 18 15:35:25 crc kubenswrapper[4792]: E0318 15:35:25.943078 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.043398 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.144079 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.149379 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.150001 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.152784 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" exitCode=255 Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.152826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44"} Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.152866 4792 scope.go:117] "RemoveContainer" containerID="7f244be4b3af5dd17ab40e14249019bb5b03b40beaf54c10c2a20dfef47446ab" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.153169 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.154727 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.154796 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.154824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.155892 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.156271 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.244375 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.344555 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.445313 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.546339 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: I0318 15:35:26.597658 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:26 crc 
kubenswrapper[4792]: E0318 15:35:26.646855 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.747323 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.847913 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:26 crc kubenswrapper[4792]: E0318 15:35:26.949020 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.049564 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.149750 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.157591 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.161120 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.162771 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.162833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.162858 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.164051 4792 
scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.164458 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.250388 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.350745 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.451296 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.552392 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.653497 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.754055 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: I0318 15:35:27.850317 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.854352 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:27 crc kubenswrapper[4792]: E0318 15:35:27.954453 
4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.043752 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.055037 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.156003 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.164057 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.165236 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.165307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.165343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.166305 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.166670 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:28 crc kubenswrapper[4792]: 
E0318 15:35:28.256205 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.356360 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.456780 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.512055 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.557155 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.658219 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.758635 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.859435 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: E0318 15:35:28.960228 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 15:35:28 crc kubenswrapper[4792]: I0318 15:35:28.996158 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.063769 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.063817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.063829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.063853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.063867 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.166510 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.166548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.166564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.166586 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.166603 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.269659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.269708 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.269719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.269738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.269750 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.373085 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.373120 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.373130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.373149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.373161 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.476149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.476191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.476207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.476228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.476244 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.578736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.578798 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.578818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.578840 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.578857 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.681529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.681581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.681601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.681624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.681640 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.784370 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.784408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.784435 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.784449 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.784458 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.850833 4792 apiserver.go:52] "Watching apiserver" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.857269 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.858011 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4pndk","openshift-dns/node-resolver-tbjvb","openshift-machine-config-operator/machine-config-daemon-2wtm6","openshift-multus/multus-fqr6h","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-multus/multus-additional-cni-plugins-qnps4","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.858532 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.858648 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.858855 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.858945 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.858954 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.859069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.859445 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.859560 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.859621 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.859684 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.859718 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.859845 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.860745 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.860802 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.864416 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.864838 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.865445 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.866755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.867304 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868235 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868355 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868601 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868727 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.868872 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869059 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869258 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869302 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869496 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869677 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869933 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870071 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870176 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870420 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.870897 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.871369 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.869302 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.872086 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.872423 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.872658 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.874629 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.888356 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.888396 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.888407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.888425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.888438 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:29Z","lastTransitionTime":"2026-03-18T15:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.889022 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.889104 4792 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.904371 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.917355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.931787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.932280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.932426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.932564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.932842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933015 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933307 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933443 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933582 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " 
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933710 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933859 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934036 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933056 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933052 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933363 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933770 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.933950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934457 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934216 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.934806 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935360 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935428 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935617 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.935919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936081 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936321 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.936507 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937451 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937096 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937792 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937834 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937869 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 
15:35:29.937904 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.937939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938023 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938027 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938055 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938110 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938575 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938638 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938689 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938738 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938783 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938837 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938885 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.938931 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939008 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939106 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939151 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939197 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939216 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939244 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939338 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939482 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939531 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939575 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939585 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939617 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939668 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939756 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939847 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.939939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940151 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940196 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940242 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940333 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940373 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940552 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940655 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940704 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 15:35:29 crc 
kubenswrapper[4792]: I0318 15:35:29.940751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940884 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.940930 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941018 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941267 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941314 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941415 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941458 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941500 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941599 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941633 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941668 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941733 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941765 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941797 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.941921 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942086 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942386 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942456 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942532 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942569 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942605 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942636 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942659 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942706 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942728 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942749 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942772 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942794 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942815 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942837 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942867 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.942884 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943007 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943074 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943137 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943194 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943350 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943392 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943395 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943403 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943474 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943714 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.943751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944074 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944101 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944110 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944142 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944177 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944215 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944307 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 
18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944574 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944624 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944857 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944887 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944952 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.944962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945031 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945097 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945173 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945256 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945290 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945325 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945357 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945400 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945411 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945428 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945455 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945596 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945620 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945624 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945873 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.946070 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.947450 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.947640 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.947700 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.947985 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.948124 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.948195 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.948374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.948469 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949063 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949363 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949815 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.949904 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.950172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.950321 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.950452 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.950524 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.950945 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.951097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.951229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.951263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.951740 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.951850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.952341 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.952376 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953469 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.953932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.953955 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:35:30.453885112 +0000 UTC m=+79.323214169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954124 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954648 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954698 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954892 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.954888 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.955329 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.955462 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.955478 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.955659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.955700 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.956525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.956938 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957737 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957838 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957819 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.957760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.958211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.959690 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.959737 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.959824 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960107 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.959914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.945644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960284 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960501 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960559 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960616 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960690 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960744 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960836 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960891 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.960944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961036 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961241 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961317 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961802 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.961930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.962392 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.962568 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.962845 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.962882 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963135 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963645 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.963765 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964057 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.964647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e51cef14-7d91-4e08-8045-831f7a9a65f8-proxy-tls\") pod \"machine-config-daemon-2wtm6\" (UID: 
\"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965177 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965479 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965685 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlg8\" (UniqueName: \"kubernetes.io/projected/1f709bbd-6cce-421b-90fe-8c9047004002-kube-api-access-vjlg8\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-netns\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.965874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-conf-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-system-cni-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") 
" pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966152 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966161 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-k8s-cni-cncf-io\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966515 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966583 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-multus\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdtl\" (UniqueName: \"kubernetes.io/projected/241b9e3f-bd41-4fb2-a68a-9395a67feaae-kube-api-access-mkdtl\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f709bbd-6cce-421b-90fe-8c9047004002-hosts-file\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-os-release\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-kubelet\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-etc-kubernetes\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966917 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.966950 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-hostroot\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967096 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqswl\" (UniqueName: \"kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cnibin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967186 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 
crc kubenswrapper[4792]: I0318 15:35:29.967325 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-system-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cni-binary-copy\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967405 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc 
kubenswrapper[4792]: I0318 15:35:29.967427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-socket-dir-parent\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-bin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-daemon-config\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-multus-certs\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:29 crc 
kubenswrapper[4792]: I0318 15:35:29.967639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-os-release\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.967805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51cef14-7d91-4e08-8045-831f7a9a65f8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6"
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.967839 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968513 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cnibin\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44sr\" (UniqueName: \"kubernetes.io/projected/2c61302f-31a0-4ba3-99b0-e5206c848cd8-kube-api-access-s44sr\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.968677 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51cef14-7d91-4e08-8045-831f7a9a65f8-rootfs\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.968952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969130 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5nt\" (UniqueName: \"kubernetes.io/projected/e51cef14-7d91-4e08-8045-831f7a9a65f8-kube-api-access-cm5nt\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969537 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969627 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969690 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969717 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969777 4792 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969808 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969832 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.969934 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970015 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970045 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970068 4792 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970093 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970117 4792 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970142 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970173 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970207 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.970241 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.971055 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.971305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.971532 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.971676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972156 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.972885 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:30.472860755 +0000 UTC m=+79.342189712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.973074 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:30.473058622 +0000 UTC m=+79.342387569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972413 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.972881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.973566 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.973592 4792 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.977680 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.980547 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.981383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.981429 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.981491 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.981523 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.981546 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.981570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.981618 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:30.481595182 +0000 UTC m=+79.350924159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982166 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982214 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982238 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982252 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982262 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982272 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982284 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982294 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982304 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982313 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982322 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982331 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982340 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982350 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982359 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982368 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982377 4792 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982416 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982426 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982457 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982466 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982477 4792 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982486 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982496 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982507 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982516 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982524 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982534 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982545 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982554 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982563 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982572 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982581 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982590 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982601 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982611 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982620 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982629 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982638 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982647 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982658 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982667 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982677 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982686 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982696 4792 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982704 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982714 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982723 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982735 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982744 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982757 4792 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982766 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982775 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982783 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982791 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982802 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982811 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982820 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982831 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982840 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982848 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982856 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982865 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982873 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982882 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc 
kubenswrapper[4792]: I0318 15:35:29.982891 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982900 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982908 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982918 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982926 4792 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982937 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982940 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.982951 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983012 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983025 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983036 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983047 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983081 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983093 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983103 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983113 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983122 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983150 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983160 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983171 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983182 4792 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983192 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983204 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983233 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983248 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983261 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983274 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983306 4792 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 
15:35:29.983322 4792 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983507 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983335 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983928 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983943 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.983955 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984001 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984017 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984030 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984044 4792 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984055 4792 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984083 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984095 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984105 4792 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc 
kubenswrapper[4792]: I0318 15:35:29.984127 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984154 4792 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984168 4792 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984180 4792 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984193 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984205 4792 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984239 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984251 4792 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984260 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984270 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984320 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984454 4792 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.984517 4792 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.985359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.990850 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.991246 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.991584 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.991760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.991806 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.994311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.997035 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.997121 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.997130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.997459 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.997493 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.997512 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:29 crc kubenswrapper[4792]: E0318 15:35:29.997588 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:30.49756601 +0000 UTC m=+79.366895037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.998524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.999740 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:29 crc kubenswrapper[4792]: I0318 15:35:29.999842 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:29.999883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.000863 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.002123 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.002191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.002769 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.003509 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.003669 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.003932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.004221 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.004277 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.004379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.004526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.005471 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.011461 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.012328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.012484 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.012788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.012800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.012949 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013289 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013768 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013821 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.013845 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.014462 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.014483 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.014863 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.014991 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.026219 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.026934 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.037764 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.044287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.046289 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.046685 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.061117 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085291 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqswl\" (UniqueName: \"kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cnibin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-system-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085858 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cni-binary-copy\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.085920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-daemon-config\") pod 
\"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-multus-certs\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086081 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-os-release\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086221 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-socket-dir-parent\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-bin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc 
kubenswrapper[4792]: I0318 15:35:30.086435 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cnibin\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44sr\" (UniqueName: \"kubernetes.io/projected/2c61302f-31a0-4ba3-99b0-e5206c848cd8-kube-api-access-s44sr\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51cef14-7d91-4e08-8045-831f7a9a65f8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51cef14-7d91-4e08-8045-831f7a9a65f8-rootfs\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 
crc kubenswrapper[4792]: I0318 15:35:30.086671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5nt\" (UniqueName: \"kubernetes.io/projected/e51cef14-7d91-4e08-8045-831f7a9a65f8-kube-api-access-cm5nt\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e51cef14-7d91-4e08-8045-831f7a9a65f8-proxy-tls\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086752 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc 
kubenswrapper[4792]: I0318 15:35:30.086830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-system-cni-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-system-cni-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 
15:35:30.086905 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086924 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlg8\" (UniqueName: \"kubernetes.io/projected/1f709bbd-6cce-421b-90fe-8c9047004002-kube-api-access-vjlg8\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-netns\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086993 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-conf-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087064 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-k8s-cni-cncf-io\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-multus\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdtl\" (UniqueName: \"kubernetes.io/projected/241b9e3f-bd41-4fb2-a68a-9395a67feaae-kube-api-access-mkdtl\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-etc-kubernetes\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087178 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f709bbd-6cce-421b-90fe-8c9047004002-hosts-file\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-os-release\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087265 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-kubelet\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-hostroot\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " 
pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087336 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087358 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087454 4792 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51cef14-7d91-4e08-8045-831f7a9a65f8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087479 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-etc-kubernetes\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-socket-dir-parent\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 
15:35:30.087562 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51cef14-7d91-4e08-8045-831f7a9a65f8-rootfs\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087585 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087583 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-k8s-cni-cncf-io\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cnibin\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087619 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f709bbd-6cce-421b-90fe-8c9047004002-hosts-file\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086933 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087505 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-multus-certs\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087697 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.086094 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-os-release\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-conf-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.087905 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-kubelet\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088090 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-hostroot\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088140 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-system-cni-dir\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088175 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-run-netns\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-bin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash\") pod \"ovnkube-node-4pndk\" (UID: 
\"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-multus-daemon-config\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-host-var-lib-cni-multus\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cni-binary-copy\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:30 crc 
kubenswrapper[4792]: I0318 15:35:30.088583 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-os-release\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/241b9e3f-bd41-4fb2-a68a-9395a67feaae-cnibin\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088803 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088874 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.088933 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089125 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089135 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c61302f-31a0-4ba3-99b0-e5206c848cd8-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089144 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089198 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089226 4792 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089242 4792 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089255 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089266 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089285 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089307 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089324 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089335 4792 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089387 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc 
kubenswrapper[4792]: I0318 15:35:30.089397 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089408 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089420 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089431 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089440 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089449 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089459 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089469 4792 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089479 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089488 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089500 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089522 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089538 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c61302f-31a0-4ba3-99b0-e5206c848cd8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " 
pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089756 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089770 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089781 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089790 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089799 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089809 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089818 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089912 4792 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089927 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.089938 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090007 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090058 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090068 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090077 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090086 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090094 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090103 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090113 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090122 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090131 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090141 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090156 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090176 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090191 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090205 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090217 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090228 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.090790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.091891 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e51cef14-7d91-4e08-8045-831f7a9a65f8-proxy-tls\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.095426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.102925 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.102955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.102965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.103018 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.103027 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.103253 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqswl\" (UniqueName: \"kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl\") pod \"ovnkube-node-4pndk\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.105590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5nt\" (UniqueName: \"kubernetes.io/projected/e51cef14-7d91-4e08-8045-831f7a9a65f8-kube-api-access-cm5nt\") pod \"machine-config-daemon-2wtm6\" (UID: \"e51cef14-7d91-4e08-8045-831f7a9a65f8\") " pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.110707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44sr\" (UniqueName: \"kubernetes.io/projected/2c61302f-31a0-4ba3-99b0-e5206c848cd8-kube-api-access-s44sr\") pod \"multus-additional-cni-plugins-qnps4\" (UID: \"2c61302f-31a0-4ba3-99b0-e5206c848cd8\") " pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.111892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlg8\" (UniqueName: \"kubernetes.io/projected/1f709bbd-6cce-421b-90fe-8c9047004002-kube-api-access-vjlg8\") pod \"node-resolver-tbjvb\" (UID: \"1f709bbd-6cce-421b-90fe-8c9047004002\") " pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.113384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdtl\" (UniqueName: \"kubernetes.io/projected/241b9e3f-bd41-4fb2-a68a-9395a67feaae-kube-api-access-mkdtl\") pod \"multus-fqr6h\" (UID: \"241b9e3f-bd41-4fb2-a68a-9395a67feaae\") " 
pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.189677 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.204674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.204778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.204886 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.204994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.205132 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.207806 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 15:35:30 crc kubenswrapper[4792]: W0318 15:35:30.222744 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f7276a60e29b1c14de4ea3b30fd472398a8084aa5f75003ddd190e3318e1b949 WatchSource:0}: Error finding container f7276a60e29b1c14de4ea3b30fd472398a8084aa5f75003ddd190e3318e1b949: Status 404 returned error can't find the container with id f7276a60e29b1c14de4ea3b30fd472398a8084aa5f75003ddd190e3318e1b949 Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.230345 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tbjvb" Mar 18 15:35:30 crc kubenswrapper[4792]: W0318 15:35:30.244379 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f709bbd_6cce_421b_90fe_8c9047004002.slice/crio-0afd9d79cd68af7b5bf77d763b098982e08334253b315a60b5f56a598cde6d3e WatchSource:0}: Error finding container 0afd9d79cd68af7b5bf77d763b098982e08334253b315a60b5f56a598cde6d3e: Status 404 returned error can't find the container with id 0afd9d79cd68af7b5bf77d763b098982e08334253b315a60b5f56a598cde6d3e Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.262424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.290986 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.307319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.307370 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.307384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.307402 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.307415 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.320780 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.339403 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnps4" Mar 18 15:35:30 crc kubenswrapper[4792]: W0318 15:35:30.341215 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51cef14_7d91_4e08_8045_831f7a9a65f8.slice/crio-f43aca97e9da6d76fd3bf53f12911eb7468a1e132c0ca9b0ff20962484d48ebc WatchSource:0}: Error finding container f43aca97e9da6d76fd3bf53f12911eb7468a1e132c0ca9b0ff20962484d48ebc: Status 404 returned error can't find the container with id f43aca97e9da6d76fd3bf53f12911eb7468a1e132c0ca9b0ff20962484d48ebc Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.351721 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fqr6h" Mar 18 15:35:30 crc kubenswrapper[4792]: W0318 15:35:30.380379 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241b9e3f_bd41_4fb2_a68a_9395a67feaae.slice/crio-cfe27d21edbba763d1f1b642572447d8f75520fcc3666badd41f960d45965155 WatchSource:0}: Error finding container cfe27d21edbba763d1f1b642572447d8f75520fcc3666badd41f960d45965155: Status 404 returned error can't find the container with id cfe27d21edbba763d1f1b642572447d8f75520fcc3666badd41f960d45965155 Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.409158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.409200 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.409213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.409231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.409245 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.493372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.493484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.493515 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493555 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:35:31.493513684 +0000 UTC m=+80.362842621 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493634 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493655 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493667 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493711 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:31.49369638 +0000 UTC m=+80.363025317 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493745 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493762 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.493655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493786 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:31.493779172 +0000 UTC m=+80.363108109 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.493803 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:31.493796993 +0000 UTC m=+80.363125930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.511121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.511153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.511162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.511177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.511189 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.595093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.595268 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.595287 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.595298 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:30 crc kubenswrapper[4792]: E0318 15:35:30.595351 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:35:31.595337668 +0000 UTC m=+80.464666605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.613682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.613727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.613739 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.613757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.613770 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.716993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.717058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.717082 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.717110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.717132 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.819349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.819402 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.819417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.819434 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.819448 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.922520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.922575 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.922591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.922610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:30 crc kubenswrapper[4792]: I0318 15:35:30.922627 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:30Z","lastTransitionTime":"2026-03-18T15:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.025488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.025779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.025790 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.025805 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.025816 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.128765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.128799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.128808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.128820 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.128829 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.172452 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbjvb" event={"ID":"1f709bbd-6cce-421b-90fe-8c9047004002","Type":"ContainerStarted","Data":"28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.172507 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbjvb" event={"ID":"1f709bbd-6cce-421b-90fe-8c9047004002","Type":"ContainerStarted","Data":"0afd9d79cd68af7b5bf77d763b098982e08334253b315a60b5f56a598cde6d3e"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.173962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.174029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d7e42a16fbdf251825ee966aab1d5658dbbe9c284bd3de3c83d96ae97b003e62"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.175307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerStarted","Data":"ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.175360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerStarted","Data":"cfe27d21edbba763d1f1b642572447d8f75520fcc3666badd41f960d45965155"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 
15:35:31.177097 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5" exitCode=0 Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.177170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.177200 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerStarted","Data":"18e7a8bd5d1b429d62275b4d9156398c4d1182cd1904a2474902493ffaa6a45d"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.179083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f7276a60e29b1c14de4ea3b30fd472398a8084aa5f75003ddd190e3318e1b949"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.181384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.181438 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.181453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"f43aca97e9da6d76fd3bf53f12911eb7468a1e132c0ca9b0ff20962484d48ebc"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.184340 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9" exitCode=0 Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.184449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.184478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"bd75d70e877e892950a35478814bf2eeedf8c188abd9ad1b0b4623f55b206380"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.186699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.186749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.186763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4953efaad001224c8ae78cc50103c5340defa5b4833b1f33bb66c40707d9e630"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.192866 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.205914 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.220304 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.230256 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.230293 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.230302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.230317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.230326 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.232945 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.245341 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.259886 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.271421 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.282464 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.291845 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.305662 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.322573 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.332351 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.332388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.332400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.332417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.332429 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.337422 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.351258 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.368497 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.381882 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.404231 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.418196 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.434311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.434343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc 
kubenswrapper[4792]: I0318 15:35:31.434351 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.434367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.434376 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.436192 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.448001 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.461622 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.479188 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.504744 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.504894 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505002 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:35:33.504945381 +0000 UTC m=+82.374274318 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.505080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505114 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505229 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:33.50520789 +0000 UTC m=+82.374536827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.505147 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505265 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505268 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505338 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:33.505324953 +0000 UTC m=+82.374654100 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505366 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505385 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.505463 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:33.505440267 +0000 UTC m=+82.374769204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.508600 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.542053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.542127 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.542136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.542152 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.542162 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.606222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.606464 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.606721 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.606744 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.606812 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:33.606793427 +0000 UTC m=+82.476122364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.644426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.644476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.644488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.644505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.644515 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.747104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.747152 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.747160 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.747173 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.747183 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.852661 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.852714 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.852728 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.852744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.852755 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.853399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.853474 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.853555 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.853612 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.853702 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.853760 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.858601 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.859423 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.860178 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.860817 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.862168 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.862701 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.863804 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.864399 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.865419 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.866068 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.866887 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.867610 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.870573 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.871219 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.878589 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.879303 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.880212 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.880758 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.881511 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.883412 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.883653 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.884293 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.884993 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.886032 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.886778 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.888392 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.889749 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.890707 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.891415 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.892920 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.894014 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.894691 4792 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.894831 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.898001 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.898884 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.899540 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.901558 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.901729 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.903313 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.904249 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.905770 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.907338 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.908046 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.909172 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.910670 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.911534 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.912780 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.913119 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.913593 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.915073 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.916154 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.917451 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.918049 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.918508 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 15:35:31 crc 
kubenswrapper[4792]: I0318 15:35:31.919480 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.920091 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.921061 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.927796 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is 
after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.935270 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.935309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.935324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.935342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.935353 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.946754 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.955134 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.960402 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.960442 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.960458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.960473 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.960485 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.961165 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.972377 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.975302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.975325 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.975332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.975347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.975357 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.977886 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: E0318 15:35:31.986735 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.993050 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.993092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.993103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.993118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.993130 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:31Z","lastTransitionTime":"2026-03-18T15:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:31 crc kubenswrapper[4792]: I0318 15:35:31.994490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: E0318 15:35:32.004062 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.008582 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.013487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.013536 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.013548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.013566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.013579 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.023566 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: E0318 15:35:32.029182 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: E0318 15:35:32.029287 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.030736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.030753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.030763 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.030780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.030792 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.034729 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.133191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.133224 4792 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.133242 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.133255 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.133264 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.201422 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925" exitCode=0 Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.201529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.209929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.209986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" 
event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.209998 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.210014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.210023 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.214712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.234006 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.235223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.235245 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.235254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.235266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.235276 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.248431 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.262065 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.275603 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.296016 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.322842 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.336023 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.337638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.337669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.337681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.337697 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.337708 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.352262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.369499 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.384782 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.440771 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc 
kubenswrapper[4792]: I0318 15:35:32.440808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.440816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.440830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.440842 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.543487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.543538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.543555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.543573 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.543585 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.645880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.645915 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.645928 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.645941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.645951 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.748094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.748143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.748151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.748167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.748183 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.851175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.851236 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.851250 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.851268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.851281 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.954006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.954070 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.954090 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.954114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:32 crc kubenswrapper[4792]: I0318 15:35:32.954136 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:32Z","lastTransitionTime":"2026-03-18T15:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.057878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.057931 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.057941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.057958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.057982 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.160379 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.160422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.160435 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.160453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.160466 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.216632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.219105 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf" exitCode=0 Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.219170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.220753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.238177 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.258756 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.267083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.267115 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.267125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.267140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.267149 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.274797 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.285846 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.302890 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 
15:35:33.316820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 
15:35:33.331472 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.343884 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.356249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.366555 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.369234 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.369295 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.369310 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.369325 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.369335 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.378583 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.393790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.403736 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.417727 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.437398 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.452634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.465092 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.471672 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.471711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.471720 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.471733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.471743 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.481933 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.493705 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.503453 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.514785 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.528985 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 
15:35:33.531303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.531464 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531481 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:35:37.531460403 +0000 UTC m=+86.400789340 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.531537 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.531569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531642 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531679 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531700 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:37.53168474 +0000 UTC m=+86.401013667 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531720 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:37.531709191 +0000 UTC m=+86.401038128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531744 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531794 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531813 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.531896 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:37.531872236 +0000 UTC m=+86.401201163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.574075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.574174 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.574190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.574298 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.574315 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.632518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.632674 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.632776 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.632789 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.632847 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:37.632831683 +0000 UTC m=+86.502160620 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.676752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.676802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.676816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.676833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.676845 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.769107 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.778960 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.779030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.779071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.779091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.779106 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.854441 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.854633 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.854468 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.854747 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.854450 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:33 crc kubenswrapper[4792]: E0318 15:35:33.854836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.882260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.882321 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.882343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.882374 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.882398 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.985382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.985436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.985458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.985488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:33 crc kubenswrapper[4792]: I0318 15:35:33.985508 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:33Z","lastTransitionTime":"2026-03-18T15:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.088602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.088685 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.088710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.088742 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.088768 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.191660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.191715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.191729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.191749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.191764 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.227393 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52" exitCode=0 Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.227492 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.251888 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.268849 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.286088 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.294651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.294685 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.294694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc 
kubenswrapper[4792]: I0318 15:35:34.294709 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.294719 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.303241 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3
a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.316482 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.327594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.341065 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.353523 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.366681 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.377336 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.395270 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:34Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.396808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.396837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.396847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.396861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.396872 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.498961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.499023 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.499034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.499054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.499066 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.602374 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.602426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.602438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.602457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.602469 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.705575 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.705624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.705633 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.705646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.705655 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.808619 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.808669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.808707 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.808731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.808744 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.910655 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.910698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.910714 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.910732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:34 crc kubenswrapper[4792]: I0318 15:35:34.910744 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:34Z","lastTransitionTime":"2026-03-18T15:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.012659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.012715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.012729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.012751 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.012765 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.115478 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.115529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.115541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.115559 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.115571 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.217616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.217959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.218199 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.218396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.218561 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.235163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.237768 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf" exitCode=0 Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.237793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.259639 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.283435 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.298844 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3
a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.314525 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.321419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc 
kubenswrapper[4792]: I0318 15:35:35.321455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.321466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.321482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.321493 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.327493 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.337958 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.351101 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.362082 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.374057 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.386932 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.397398 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.423667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.423702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.423712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.423728 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.423738 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.526530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.526592 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.526605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.526622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.526635 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.588536 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-772vs"] Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.588960 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.591380 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.591615 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.592623 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.593442 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.602136 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.618302 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.629900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.629965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.630046 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.630070 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.630102 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.631392 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.639780 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.651342 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.651386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49d0b1b6-4001-4571-8af7-57a361c58c49-host\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.651547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49d0b1b6-4001-4571-8af7-57a361c58c49-serviceca\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.651580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbctn\" (UniqueName: \"kubernetes.io/projected/49d0b1b6-4001-4571-8af7-57a361c58c49-kube-api-access-nbctn\") pod 
\"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.663293 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\
\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.681840 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.698443 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.709602 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.725233 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.732106 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.732141 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.732151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.732166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.732176 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.740715 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.752688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49d0b1b6-4001-4571-8af7-57a361c58c49-host\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.752741 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/49d0b1b6-4001-4571-8af7-57a361c58c49-serviceca\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.752767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbctn\" (UniqueName: \"kubernetes.io/projected/49d0b1b6-4001-4571-8af7-57a361c58c49-kube-api-access-nbctn\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.752799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49d0b1b6-4001-4571-8af7-57a361c58c49-host\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.753717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49d0b1b6-4001-4571-8af7-57a361c58c49-serviceca\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.759463 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:35Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.770465 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbctn\" (UniqueName: \"kubernetes.io/projected/49d0b1b6-4001-4571-8af7-57a361c58c49-kube-api-access-nbctn\") pod \"node-ca-772vs\" (UID: \"49d0b1b6-4001-4571-8af7-57a361c58c49\") " pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.836682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.836732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.836746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.836767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.836782 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.853663 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.853731 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.853757 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:35 crc kubenswrapper[4792]: E0318 15:35:35.853958 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:35 crc kubenswrapper[4792]: E0318 15:35:35.854139 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:35 crc kubenswrapper[4792]: E0318 15:35:35.854267 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.902487 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-772vs" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.939916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.939962 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.939993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.940011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:35 crc kubenswrapper[4792]: I0318 15:35:35.940024 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:35Z","lastTransitionTime":"2026-03-18T15:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.043945 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.044450 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.044462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.044480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.044491 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.147091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.147451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.147469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.147486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.147496 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.245802 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c61302f-31a0-4ba3-99b0-e5206c848cd8" containerID="76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752" exitCode=0 Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.245873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerDied","Data":"76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.247591 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-772vs" event={"ID":"49d0b1b6-4001-4571-8af7-57a361c58c49","Type":"ContainerStarted","Data":"67573d0bc7546abd9def936f1c7f4dd25e7326b591c0df5daf19b7edabdad733"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.249308 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.249504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.249518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.249541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.249560 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.260694 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.274502 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.288629 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.302700 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.317093 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.328935 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.353932 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.354703 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.354744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.354757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.354776 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.354788 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.366694 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.376793 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.390599 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.402140 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.410698 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:36Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.459017 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.459041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.459050 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.459062 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.459071 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.561778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.561832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.561844 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.561862 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.561874 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.666072 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.666160 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.666184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.666220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.666245 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.769025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.769067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.769078 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.769096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.769107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.872681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.873095 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.873112 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.873133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.873146 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.975764 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.975801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.975810 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.975822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:36 crc kubenswrapper[4792]: I0318 15:35:36.975831 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:36Z","lastTransitionTime":"2026-03-18T15:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.078834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.078879 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.078888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.078904 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.078917 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.181025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.181070 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.181082 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.181098 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.181114 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.252405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-772vs" event={"ID":"49d0b1b6-4001-4571-8af7-57a361c58c49","Type":"ContainerStarted","Data":"b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.256545 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" event={"ID":"2c61302f-31a0-4ba3-99b0-e5206c848cd8","Type":"ContainerStarted","Data":"2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.260530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.260803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.268314 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.279355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.282869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.282902 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.282911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.282925 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.282934 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.285796 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.296226 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.311183 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.329203 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.343183 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.354281 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.370893 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.384823 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.384854 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.384864 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.384877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.384886 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.392533 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.408649 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.422108 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.442723 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.492855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.492892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.492900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.492914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.492924 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.493958 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.519306 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.532878 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.547820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616
e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.565612 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.572013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.572120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.572141 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:37 crc 
kubenswrapper[4792]: I0318 15:35:37.572160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572263 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572303 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572298 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.572256988 +0000 UTC m=+94.441585965 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572268 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572349 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572357 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.572342921 +0000 UTC m=+94.441671898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572359 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572384 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.572372172 +0000 UTC m=+94.441701149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.572424 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.572411493 +0000 UTC m=+94.441740430 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.577768 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.590545 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.594904 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.594930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.594939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.594953 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.594962 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.604141 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.618370 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.633009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.645807 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.660370 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:37Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.673039 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.673214 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.673235 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.673246 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.673306 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.673290717 +0000 UTC m=+94.542619654 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.696877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.696931 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.696944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.696962 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.697003 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.804662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.804702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.805130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.806190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.806269 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.854180 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.854216 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.854338 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.854414 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.854463 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:37 crc kubenswrapper[4792]: E0318 15:35:37.854639 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.909722 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.909773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.909786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.909806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:37 crc kubenswrapper[4792]: I0318 15:35:37.909848 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:37Z","lastTransitionTime":"2026-03-18T15:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.013104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.013162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.013180 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.013203 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.013220 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.116280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.116333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.116346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.116365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.116378 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.218653 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.218714 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.218729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.218747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.218761 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.266785 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.266845 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.291840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.310637 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.321696 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.321760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.321776 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.321798 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.321813 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.333418 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.346026 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.362145 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.393151 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.409872 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.424048 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.424097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.424108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.424125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.424136 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.430336 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.443740 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.455580 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.469527 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.483773 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.492622 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.526662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.526709 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.526722 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.526738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.526751 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.629302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.629403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.629425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.629448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.629491 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.732579 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.732629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.732646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.732671 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.732689 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.836324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.836373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.836385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.836402 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.836414 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.940422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.940455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.940464 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.940477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:38 crc kubenswrapper[4792]: I0318 15:35:38.940485 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:38Z","lastTransitionTime":"2026-03-18T15:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.043558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.043605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.043620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.043642 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.043660 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.146021 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.146071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.146089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.146152 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.146169 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.250667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.250702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.250712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.250727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.250737 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.353448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.353492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.353502 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.353515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.353525 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.455497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.455538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.455548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.455563 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.455574 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.579444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.579498 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.579554 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.579578 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.579589 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.682692 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.683166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.683187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.683217 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.683241 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.785684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.785737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.785753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.785776 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.785795 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.853653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.853722 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.853818 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:39 crc kubenswrapper[4792]: E0318 15:35:39.853810 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:39 crc kubenswrapper[4792]: E0318 15:35:39.853931 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:39 crc kubenswrapper[4792]: E0318 15:35:39.853994 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.878354 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.879168 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:39 crc kubenswrapper[4792]: E0318 15:35:39.879389 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.887555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.887593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.887606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.887622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.887634 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.990863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.990939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.990957 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.991049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:39 crc kubenswrapper[4792]: I0318 15:35:39.991069 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:39Z","lastTransitionTime":"2026-03-18T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.094322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.094398 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.094421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.094453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.094477 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.197293 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.197338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.197351 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.197367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.197379 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.273083 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/0.log" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.276838 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8" exitCode=1 Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.276898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.277508 4792 scope.go:117] "RemoveContainer" containerID="529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.277563 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:40 crc kubenswrapper[4792]: E0318 15:35:40.277741 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.290990 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.302338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.302378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.302391 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.302409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.302421 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.309460 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.330522 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4
045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:
35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.342126 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.351821 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.364686 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.376805 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.390409 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.404042 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.406234 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.406260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.406268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.406284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.406294 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.417717 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.433296 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.444863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.460898 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:40Z\\\",\\\"message\\\":\\\" event handler 3 for removal\\\\nI0318 15:35:39.759751 6367 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:35:39.759768 6367 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:35:39.759781 6367 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:35:39.759807 6367 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:35:39.760657 6367 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 15:35:39.760682 6367 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 15:35:39.760729 6367 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:35:39.760771 6367 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:35:39.760803 6367 factory.go:656] Stopping watch factory\\\\nI0318 15:35:39.760800 6367 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:35:39.760818 6367 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:39.760823 6367 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:35:39.760833 6367 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 
15:35:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf3
8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:40Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.509311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.509350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.509361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.509377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.509389 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.612228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.612284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.612302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.612323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.612336 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.723413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.723454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.723465 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.723485 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.723497 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.826250 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.826296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.826307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.826323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.826333 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.928371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.928403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.928411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.928423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:40 crc kubenswrapper[4792]: I0318 15:35:40.928432 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:40Z","lastTransitionTime":"2026-03-18T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.030756 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.030817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.030830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.030849 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.030864 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.133425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.133480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.133498 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.133523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.133540 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.236163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.236214 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.236225 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.236241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.236256 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.282459 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/1.log" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.283330 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/0.log" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.287293 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b" exitCode=1 Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.287414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.287497 4792 scope.go:117] "RemoveContainer" containerID="529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.288025 4792 scope.go:117] "RemoveContainer" containerID="914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b" Mar 18 15:35:41 crc kubenswrapper[4792]: E0318 15:35:41.288189 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.303466 4792 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.318414 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.329661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.338848 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.338889 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.338903 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.338918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.338926 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.339540 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.350517 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.362454 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.372961 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.391511 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:40Z\\\",\\\"message\\\":\\\" event handler 3 for removal\\\\nI0318 15:35:39.759751 6367 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:35:39.759768 6367 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 
15:35:39.759781 6367 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:35:39.759807 6367 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:35:39.760657 6367 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 15:35:39.760682 6367 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 15:35:39.760729 6367 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:35:39.760771 6367 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:35:39.760803 6367 factory.go:656] Stopping watch factory\\\\nI0318 15:35:39.760800 6367 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:35:39.760818 6367 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:39.760823 6367 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:35:39.760833 6367 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:35:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: 
I0318 15:35:41.408591 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.419098 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.430863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.441866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.442048 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.442089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.442313 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.442416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.442504 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.453824 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.525214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh"] Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.525765 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.528710 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.529259 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.540121 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.544799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.544847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.544856 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.544868 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.544878 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.557439 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.572563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.585835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.600518 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.611043 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.622327 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.630412 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwbm\" (UniqueName: \"kubernetes.io/projected/5263fba0-5316-48e0-a254-a4c598e30f02-kube-api-access-gtwbm\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.630459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.630486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.630506 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5263fba0-5316-48e0-a254-a4c598e30f02-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.645041 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:40Z\\\",\\\"message\\\":\\\" event handler 3 for removal\\\\nI0318 15:35:39.759751 6367 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:35:39.759768 6367 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 
15:35:39.759781 6367 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:35:39.759807 6367 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:35:39.760657 6367 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 15:35:39.760682 6367 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 15:35:39.760729 6367 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:35:39.760771 6367 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:35:39.760803 6367 factory.go:656] Stopping watch factory\\\\nI0318 15:35:39.760800 6367 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:35:39.760818 6367 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:39.760823 6367 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:35:39.760833 6367 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:35:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: 
I0318 15:35:41.647350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.647386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.647395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.647410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.647419 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.659531 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.672133 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.690379 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.705346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694
cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.719714 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6
c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.731486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwbm\" (UniqueName: \"kubernetes.io/projected/5263fba0-5316-48e0-a254-a4c598e30f02-kube-api-access-gtwbm\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.731542 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.731577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.731598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5263fba0-5316-48e0-a254-a4c598e30f02-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.732947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.733403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5263fba0-5316-48e0-a254-a4c598e30f02-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.734412 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.740268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5263fba0-5316-48e0-a254-a4c598e30f02-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.749676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.749732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.749744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.749760 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.749772 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.751214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwbm\" (UniqueName: \"kubernetes.io/projected/5263fba0-5316-48e0-a254-a4c598e30f02-kube-api-access-gtwbm\") pod \"ovnkube-control-plane-749d76644c-x8lmh\" (UID: \"5263fba0-5316-48e0-a254-a4c598e30f02\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.841029 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.851689 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.851743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.851761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.851786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.851811 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.853177 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:41 crc kubenswrapper[4792]: E0318 15:35:41.853290 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.853369 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:41 crc kubenswrapper[4792]: E0318 15:35:41.853439 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.853528 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:41 crc kubenswrapper[4792]: E0318 15:35:41.853583 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:41 crc kubenswrapper[4792]: W0318 15:35:41.856054 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5263fba0_5316_48e0_a254_a4c598e30f02.slice/crio-a7485805fe6e31514a173c824c6c1ed4fb44f12b960e1c9b068a3b2a655f1eeb WatchSource:0}: Error finding container a7485805fe6e31514a173c824c6c1ed4fb44f12b960e1c9b068a3b2a655f1eeb: Status 404 returned error can't find the container with id a7485805fe6e31514a173c824c6c1ed4fb44f12b960e1c9b068a3b2a655f1eeb Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.869427 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 
15:35:41.897646 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.917995 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.932949 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.953921 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.953959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.953983 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.953999 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.954012 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:41Z","lastTransitionTime":"2026-03-18T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.953998 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.965295 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.978308 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:41 crc kubenswrapper[4792]: I0318 15:35:41.991304 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.010649 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:40Z\\\",\\\"message\\\":\\\" event handler 3 for removal\\\\nI0318 15:35:39.759751 6367 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:35:39.759768 6367 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:35:39.759781 6367 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:35:39.759807 6367 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:35:39.760657 6367 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 15:35:39.760682 6367 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 15:35:39.760729 6367 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:35:39.760771 6367 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:35:39.760803 6367 factory.go:656] Stopping watch factory\\\\nI0318 15:35:39.760800 6367 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:35:39.760818 6367 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:39.760823 6367 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:35:39.760833 6367 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:35:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d
4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.026282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.038585 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.050831 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.056838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.056878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.056889 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.056908 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.056918 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.063997 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z 
is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.073205 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.160101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.160145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.160156 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.160171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.160184 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.250858 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.250897 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.250906 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.250923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.250932 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.269859 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.273672 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rpvb6"] Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.274386 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.274492 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.276483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.276521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.276530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.276546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.276556 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.292131 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/1.log" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.296212 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" event={"ID":"5263fba0-5316-48e0-a254-a4c598e30f02","Type":"ContainerStarted","Data":"a7485805fe6e31514a173c824c6c1ed4fb44f12b960e1c9b068a3b2a655f1eeb"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.296312 4792 scope.go:117] "RemoveContainer" containerID="914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.296563 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.301281 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.302074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.307001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.307036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.307045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.307060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.307070 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.316252 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.322252 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.326593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.326658 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.326680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.326706 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.326725 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.336080 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc 
kubenswrapper[4792]: I0318 15:35:42.339660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.339716 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwl5\" (UniqueName: \"kubernetes.io/projected/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-kube-api-access-snwl5\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.349624 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.354643 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.355928 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.355983 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.356002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.356019 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.356032 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.374276 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.374361 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.374511 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.376561 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.376618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.376631 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.376648 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.376662 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.396385 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.410125 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.426547 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.441172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.441220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwl5\" (UniqueName: \"kubernetes.io/projected/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-kube-api-access-snwl5\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.441400 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.441500 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:35:42.941480838 +0000 UTC m=+91.810809765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.442386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.455609 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.462013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwl5\" (UniqueName: \"kubernetes.io/projected/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-kube-api-access-snwl5\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.469178 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.479137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.479171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.479179 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 
15:35:42.479192 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.479203 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.485339 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.500563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.523430 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529b492501dc64b9256b6d9991bddbd6a0f02ae6b55c932aaa7eec9279f71ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:40Z\\\",\\\"message\\\":\\\" event handler 3 for removal\\\\nI0318 15:35:39.759751 6367 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:35:39.759768 6367 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 
15:35:39.759781 6367 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:35:39.759807 6367 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:35:39.760657 6367 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 15:35:39.760682 6367 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 15:35:39.760729 6367 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 15:35:39.760771 6367 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:35:39.760803 6367 factory.go:656] Stopping watch factory\\\\nI0318 15:35:39.760800 6367 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 15:35:39.760818 6367 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:39.760823 6367 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 15:35:39.760833 6367 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 15:35:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: 
I0318 15:35:42.539035 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.554227 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc 
kubenswrapper[4792]: I0318 15:35:42.572078 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.581601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.581654 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.581663 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.581679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.581690 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.596456 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.631333 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4
045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:
35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.652417 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.671459 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.684266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.684321 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.684332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.684350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.684362 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.687733 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.699894 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.715825 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.729188 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.746982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.758067 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.772501 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.783726 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.786295 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.786327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.786337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.786351 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.786360 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.805492 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.865349 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.889327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.889397 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.889420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.889456 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.889483 4792 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.945801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.946090 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:42 crc kubenswrapper[4792]: E0318 15:35:42.946226 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:35:43.94619235 +0000 UTC m=+92.815521337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.993132 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.993196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.993214 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.993239 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:42 crc kubenswrapper[4792]: I0318 15:35:42.993256 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:42Z","lastTransitionTime":"2026-03-18T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.096183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.096237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.096273 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.096294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.096310 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.199824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.199871 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.199880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.199895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.199904 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301370 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301381 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301403 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" event={"ID":"5263fba0-5316-48e0-a254-a4c598e30f02","Type":"ContainerStarted","Data":"e72a286301ed19f52214c3cec88987f1f197d9af7cbd69301b7f8a183bc7d2b4"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.301922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" event={"ID":"5263fba0-5316-48e0-a254-a4c598e30f02","Type":"ContainerStarted","Data":"101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.324122 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.340300 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 
2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.356263 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.381418 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.402068 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.403793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.403855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.403868 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc 
kubenswrapper[4792]: I0318 15:35:43.403888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.403905 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.416788 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.435738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.451145 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.464073 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.480497 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc 
kubenswrapper[4792]: I0318 15:35:43.495785 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.506828 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.506888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.506908 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.506934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.506952 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.510767 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.526496 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.540327 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.555535 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.566578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.609438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.609506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.609524 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.609552 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.609571 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.712323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.712378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.712389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.712406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.712417 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.815742 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.815797 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.815810 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.815828 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.815846 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.854182 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.854234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.854322 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.854434 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.854421 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.854567 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.854624 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.854676 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.922137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.922194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.922210 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.922230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.922243 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:43Z","lastTransitionTime":"2026-03-18T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:43 crc kubenswrapper[4792]: I0318 15:35:43.957574 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.957800 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:43 crc kubenswrapper[4792]: E0318 15:35:43.957900 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:35:45.957874105 +0000 UTC m=+94.827203132 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.025060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.025123 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.025140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.025163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.025180 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.128163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.128218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.128229 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.128244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.128254 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.230346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.230396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.230407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.230428 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.230440 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.333372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.333431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.333449 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.333474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.333490 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.436608 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.436806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.436881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.436916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.436936 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.539126 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.539160 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.539169 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.539182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.539190 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.641946 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.642057 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.642080 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.642105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.642126 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.747423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.747522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.747550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.747581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.747604 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.854541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.854582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.854594 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.854608 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.854620 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.957364 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.957437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.957457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.957483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:44 crc kubenswrapper[4792]: I0318 15:35:44.957504 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:44Z","lastTransitionTime":"2026-03-18T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.060263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.060305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.060315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.060334 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.060354 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.162725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.162755 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.162766 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.162778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.162788 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.265738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.265784 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.265796 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.265812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.265825 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.368066 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.368105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.368114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.368130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.368140 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.471437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.471487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.471504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.471530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.471548 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.574704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.574760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.574777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.574800 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.574816 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.577367 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.577540 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.577580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.577629 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577692 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:36:01.577653468 +0000 UTC m=+110.446982445 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577742 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577792 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577818 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577835 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577842 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: 
E0318 15:35:45.577903 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:01.577864455 +0000 UTC m=+110.447193432 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.577950 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:01.577923536 +0000 UTC m=+110.447252533 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.578023 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:01.578001808 +0000 UTC m=+110.447330875 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.678272 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.678472 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.678501 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.678521 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.678596 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:36:01.678574814 +0000 UTC m=+110.547903781 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.679897 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.679940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.679951 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.679998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.680011 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.782474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.782533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.782549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.782571 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.782584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.854071 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.854181 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.854204 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.854172 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.854329 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.854436 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.854575 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.854794 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.885316 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.885375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.885392 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.885415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.885432 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.982605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.982808 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: E0318 15:35:45.982981 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:35:49.982937582 +0000 UTC m=+98.852266539 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.987793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.987827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.987836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.987853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:45 crc kubenswrapper[4792]: I0318 15:35:45.987865 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:45Z","lastTransitionTime":"2026-03-18T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.090476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.090532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.090545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.090559 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.090570 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.194148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.194206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.194223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.194247 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.194265 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.296620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.296656 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.296665 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.296679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.296690 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.405491 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.405546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.405559 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.405574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.405584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.508506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.508565 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.508577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.508594 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.508604 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.611163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.611227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.611244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.611272 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.611291 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.714795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.714854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.714877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.714908 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.714931 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.818194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.818266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.818291 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.818322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.818343 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.921245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.921317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.921340 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.921372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:46 crc kubenswrapper[4792]: I0318 15:35:46.921396 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:46Z","lastTransitionTime":"2026-03-18T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.024574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.024638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.024656 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.024682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.024700 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.127780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.127860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.127883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.127917 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.127942 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.231271 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.231333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.231346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.231363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.231375 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.334550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.334618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.334635 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.334669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.334687 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.437243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.437687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.437706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.437729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.437745 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.540388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.540455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.540468 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.540486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.540501 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.643154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.643205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.643215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.643235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.643246 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.746182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.746249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.746265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.746287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.746305 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.849091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.849137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.849147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.849164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.849175 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.853621 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.853697 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.853743 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:47 crc kubenswrapper[4792]: E0318 15:35:47.853924 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.853946 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:47 crc kubenswrapper[4792]: E0318 15:35:47.854117 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:47 crc kubenswrapper[4792]: E0318 15:35:47.854195 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:47 crc kubenswrapper[4792]: E0318 15:35:47.854262 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.952164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.952227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.952248 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.952277 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:47 crc kubenswrapper[4792]: I0318 15:35:47.952309 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:47Z","lastTransitionTime":"2026-03-18T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.055056 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.055124 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.055144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.055166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.055178 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.157409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.157465 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.157480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.157495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.157505 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.260060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.260127 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.260142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.260173 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.260189 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.362235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.362283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.362294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.362309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.362320 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.465857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.465916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.465930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.465949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.465961 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.569361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.569410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.569423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.569446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.569461 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.672332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.672404 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.672424 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.672450 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.672469 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.775036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.775092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.775106 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.775160 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.775176 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.877842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.877911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.877935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.877964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.878040 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.981683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.981735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.981749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.981765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:48 crc kubenswrapper[4792]: I0318 15:35:48.981777 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:48Z","lastTransitionTime":"2026-03-18T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.084224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.084279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.084295 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.084316 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.084335 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.186752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.186821 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.186845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.186874 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.186897 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.290029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.290075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.290091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.290113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.290129 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.392568 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.392619 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.392637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.392660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.392677 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.495957 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.496042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.496060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.496083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.496100 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.597829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.597860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.597868 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.597880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.597889 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.700373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.700421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.700437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.700460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.700479 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.803473 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.803546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.803570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.803602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.803624 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.854120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.854242 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:49 crc kubenswrapper[4792]: E0318 15:35:49.854283 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.854331 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.854382 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:49 crc kubenswrapper[4792]: E0318 15:35:49.854423 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:49 crc kubenswrapper[4792]: E0318 15:35:49.854517 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:49 crc kubenswrapper[4792]: E0318 15:35:49.854641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.907343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.907411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.907435 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.907469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:49 crc kubenswrapper[4792]: I0318 15:35:49.907494 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:49Z","lastTransitionTime":"2026-03-18T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.010585 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.010648 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.010665 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.010694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.010712 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.026202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:50 crc kubenswrapper[4792]: E0318 15:35:50.026409 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:50 crc kubenswrapper[4792]: E0318 15:35:50.026535 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:35:58.026504833 +0000 UTC m=+106.895833800 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.114006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.114227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.114278 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.114312 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.114335 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.217263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.217332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.217364 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.217459 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.217548 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.321244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.321341 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.321365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.321390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.321409 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.424455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.424514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.424529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.424554 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.424570 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.527284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.527738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.528005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.528243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.528427 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.631400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.631476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.631498 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.631523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.631543 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.733895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.733940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.733952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.733983 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.733997 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.837227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.837277 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.837293 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.837311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.837324 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.939252 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.939309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.939324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.939346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:50 crc kubenswrapper[4792]: I0318 15:35:50.939362 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:50Z","lastTransitionTime":"2026-03-18T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.042411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.042466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.042482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.042506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.042523 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.145685 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.145755 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.145772 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.145795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.145813 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.249224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.249287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.249309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.249335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.249352 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.352002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.352060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.352076 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.352101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.352118 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.455349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.455417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.455436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.455460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.455479 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.558930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.559003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.559017 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.559034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.559046 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.662437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.662483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.662496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.662514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.662528 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.766219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.766299 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.766317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.766341 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.766358 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.853636 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.853686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:51 crc kubenswrapper[4792]: E0318 15:35:51.853899 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.853934 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:51 crc kubenswrapper[4792]: E0318 15:35:51.854164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.854294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:51 crc kubenswrapper[4792]: E0318 15:35:51.854438 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:51 crc kubenswrapper[4792]: E0318 15:35:51.854600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.855953 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:35:51 crc kubenswrapper[4792]: E0318 15:35:51.857276 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.870786 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3e
e6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.871174 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.871417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.871602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.871783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.871928 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.886783 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc 
kubenswrapper[4792]: I0318 15:35:51.903241 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.917747 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.939645 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.955382 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.968076 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.975676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.975701 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.975712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.975727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.975737 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:51Z","lastTransitionTime":"2026-03-18T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.982581 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:51 crc kubenswrapper[4792]: I0318 15:35:51.995543 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.011284 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.031250 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.051508 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.078042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.078470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.078680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.078879 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.079296 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.085091 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.101950 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.114953 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.129857 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.183516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.184114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.184155 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.184175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.184188 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.286783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.286827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.286836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.286850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.286859 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.389033 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.389081 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.389091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.389104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.389113 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.491556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.491606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.491618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.491638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.491650 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.560584 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.560632 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.560644 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.560661 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.560673 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.585749 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.590238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.590289 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.590301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.590318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.590329 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.605405 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.609601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.609661 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.609679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.609706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.609726 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.625792 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.630384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.630413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.630421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.630434 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.630443 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.644568 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.648707 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.648731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.648739 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.648757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.648773 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.666258 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:52 crc kubenswrapper[4792]: E0318 15:35:52.666382 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.668429 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.668452 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.668461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.668474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.668483 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.771559 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.771616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.771632 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.771651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.771893 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.874782 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.874863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.874910 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.874934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.874948 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.977791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.977842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.977853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.977872 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:52 crc kubenswrapper[4792]: I0318 15:35:52.977883 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:52Z","lastTransitionTime":"2026-03-18T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.080906 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.080957 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.080996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.081016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.081027 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.183946 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.184024 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.184040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.184165 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.184180 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.287749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.287801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.287811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.287828 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.287844 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.390486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.390538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.390549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.390567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.390581 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.493484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.493541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.493555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.493570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.493581 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.595964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.596029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.596042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.596058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.596070 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.698814 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.698846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.698855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.698867 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.698878 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.801305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.801349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.801357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.801372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.801382 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.854145 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.854213 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:53 crc kubenswrapper[4792]: E0318 15:35:53.854306 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:53 crc kubenswrapper[4792]: E0318 15:35:53.854438 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.854143 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:53 crc kubenswrapper[4792]: E0318 15:35:53.854551 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.854643 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:53 crc kubenswrapper[4792]: E0318 15:35:53.854752 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.904629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.904709 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.904729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.904772 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:53 crc kubenswrapper[4792]: I0318 15:35:53.904791 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:53Z","lastTransitionTime":"2026-03-18T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.008289 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.008355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.008365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.008396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.008407 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.110497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.110566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.110589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.110618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.110640 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.214271 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.214313 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.214321 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.214335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.214343 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.320767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.320837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.320861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.320891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.320915 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.424732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.424802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.424825 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.424854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.424877 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.530301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.530375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.530389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.530407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.530418 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.633139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.633206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.633229 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.633256 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.633276 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.739114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.739167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.739185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.739211 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.739229 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.841695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.841737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.841753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.841777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.841793 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.874323 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.944150 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.944177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.944186 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.944198 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:54 crc kubenswrapper[4792]: I0318 15:35:54.944209 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:54Z","lastTransitionTime":"2026-03-18T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.046994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.047022 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.047032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.047045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.047055 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.150034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.150100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.150118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.150146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.150168 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.252858 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.252918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.252940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.252965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.253010 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.355678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.355757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.355780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.355811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.355849 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.459071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.459142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.459166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.459194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.459215 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.562624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.562698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.562811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.562911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.562936 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.666280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.666362 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.666418 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.666447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.666468 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.770469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.770550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.770572 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.770604 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.770626 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.854024 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.854120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.854146 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.854061 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:55 crc kubenswrapper[4792]: E0318 15:35:55.854269 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:55 crc kubenswrapper[4792]: E0318 15:35:55.854435 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:55 crc kubenswrapper[4792]: E0318 15:35:55.854599 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:55 crc kubenswrapper[4792]: E0318 15:35:55.854876 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.873873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.873923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.873938 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.873962 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.874022 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.977461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.977549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.977574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.977606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:55 crc kubenswrapper[4792]: I0318 15:35:55.977629 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:55Z","lastTransitionTime":"2026-03-18T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.081839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.081948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.082010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.082059 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.082087 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.185235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.185343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.185377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.185420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.185447 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.288817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.288866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.288877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.288897 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.288908 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.391919 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.392014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.392028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.392059 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.392077 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.494893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.494994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.495014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.495042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.495065 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.598287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.598343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.598355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.598373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.598386 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.702939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.703054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.703079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.703112 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.703130 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.808222 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.808260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.808270 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.808287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.808295 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.855266 4792 scope.go:117] "RemoveContainer" containerID="914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.910535 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.910595 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.910631 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.910654 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:56 crc kubenswrapper[4792]: I0318 15:35:56.910667 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:56Z","lastTransitionTime":"2026-03-18T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.013677 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.013733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.013789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.013814 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.013832 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.116662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.116714 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.116729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.116752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.116767 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.219894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.219963 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.219992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.220011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.220026 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.323914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.323988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.324005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.324031 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.324048 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.356837 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/1.log" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.360415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.360954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.378376 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477
a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.394456 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc 
kubenswrapper[4792]: I0318 15:35:57.405381 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.420891 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.429095 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.429157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.429172 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc 
kubenswrapper[4792]: I0318 15:35:57.429191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.429202 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.438801 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76
812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.452390 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.463807 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.478109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.490536 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.503362 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.517165 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.529349 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.530935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.531012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.531027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.531043 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.531054 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.548090 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.562028 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.573231 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.584620 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.603033 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:57Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.633687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.633737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.633749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.633764 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.633792 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.737025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.737067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.737076 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.737089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.737100 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.839813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.839867 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.839883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.839906 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.839926 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.853292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.853392 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:57 crc kubenswrapper[4792]: E0318 15:35:57.853498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.853605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.853620 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:57 crc kubenswrapper[4792]: E0318 15:35:57.853732 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:57 crc kubenswrapper[4792]: E0318 15:35:57.854712 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:57 crc kubenswrapper[4792]: E0318 15:35:57.854875 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.942642 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.942684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.942695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.942711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:57 crc kubenswrapper[4792]: I0318 15:35:57.942721 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:57Z","lastTransitionTime":"2026-03-18T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.047340 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.047409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.047428 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.047453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.047479 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.118756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:58 crc kubenswrapper[4792]: E0318 15:35:58.119026 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:58 crc kubenswrapper[4792]: E0318 15:35:58.119127 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:36:14.119105924 +0000 UTC m=+122.988434881 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.149940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.150002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.150013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.150030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.150044 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.252869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.252910 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.252922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.252939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.252950 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.354752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.354792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.354803 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.354818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.354827 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.367842 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/2.log" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.368579 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/1.log" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.371776 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" exitCode=1 Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.371820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.371857 4792 scope.go:117] "RemoveContainer" containerID="914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.372429 4792 scope.go:117] "RemoveContainer" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" Mar 18 15:35:58 crc kubenswrapper[4792]: E0318 15:35:58.372564 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.394654 4792 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.409367 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.429885 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.453760 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.457762 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.457812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.457827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.457843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.457852 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.464924 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.479110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.492868 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.504208 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.514243 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.522619 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.535516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.551796 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.560209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.560274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.560290 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.560312 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.560328 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.565768 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.577800 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 
2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.590627 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.608657 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914a3eb026d77a0eb29199f70861e51e054548a8e513d4f08b94ac3b5ea9975b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"message\\\":\\\"-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113446 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0318 15:35:41.113436 6734 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 15:35:41.113481 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 15:35:41.113558 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 
15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\
",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc 
kubenswrapper[4792]: I0318 15:35:58.629013 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:58Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.663471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.663546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.663559 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.663582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.663596 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.766710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.766764 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.766782 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.766805 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.766821 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.869053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.869099 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.869115 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.869145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.869157 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.972480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.972542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.972560 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.972584 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:58 crc kubenswrapper[4792]: I0318 15:35:58.972601 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:58Z","lastTransitionTime":"2026-03-18T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.075237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.075401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.075446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.075476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.075501 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.178874 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.178914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.178943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.178959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.178984 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.282062 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.282121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.282139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.282162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.282180 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.376605 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/2.log" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.380046 4792 scope.go:117] "RemoveContainer" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" Mar 18 15:35:59 crc kubenswrapper[4792]: E0318 15:35:59.380226 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.383832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.383869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.383880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.383894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.383905 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.397598 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.417551 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.435233 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.458129 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.470572 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.486750 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.486822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.486841 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.486869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.486886 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.487719 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc 
kubenswrapper[4792]: I0318 15:35:59.500605 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.515376 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.532756 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.550235 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.565991 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.583716 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.589032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.589091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.589111 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.589137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.589155 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.603929 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.619157 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 
2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.634242 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.659807 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.691204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.691267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.691285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.691306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.691318 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.698595 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:35:59Z is after 2025-08-24T17:21:41Z" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.794515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.794573 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.794664 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.794706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.794724 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.853486 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.853615 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:35:59 crc kubenswrapper[4792]: E0318 15:35:59.853641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.853709 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:35:59 crc kubenswrapper[4792]: E0318 15:35:59.853797 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.853860 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:35:59 crc kubenswrapper[4792]: E0318 15:35:59.854103 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:35:59 crc kubenswrapper[4792]: E0318 15:35:59.854261 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.898159 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.898232 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.898256 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.898284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:35:59 crc kubenswrapper[4792]: I0318 15:35:59.898307 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:35:59Z","lastTransitionTime":"2026-03-18T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.001953 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.002057 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.002083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.002115 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.002136 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.105183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.105216 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.105230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.105246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.105258 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.208187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.208239 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.208282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.208300 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.208313 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.310863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.310915 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.310929 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.310945 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.310957 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.413346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.413387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.413399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.413415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.413438 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.516439 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.516501 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.516520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.516547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.516564 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.620319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.620389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.620405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.620433 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.620450 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.723123 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.723171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.723182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.723203 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.723215 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.825278 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.825330 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.825344 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.825365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.825383 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.927604 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.927659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.927678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.927702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:00 crc kubenswrapper[4792]: I0318 15:36:00.927718 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:00Z","lastTransitionTime":"2026-03-18T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.030775 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.030836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.030854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.030881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.030899 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.133448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.133485 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.133497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.133512 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.133523 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.237074 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.237133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.237152 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.237179 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.237197 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.340146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.340219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.340239 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.340264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.340281 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.443560 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.443588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.443601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.443618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.443628 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.546486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.546542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.546553 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.546567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.546577 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.648488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.648564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.648582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.648607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.648626 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.655062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.655181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.655268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.655302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655337 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:36:33.655301536 +0000 UTC m=+142.524630533 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655456 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655490 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655526 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655545 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655560 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:36:33.655534035 +0000 UTC m=+142.524863022 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655462 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655640 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:33.655617338 +0000 UTC m=+142.524946305 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.655681 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:33.65566257 +0000 UTC m=+142.524991547 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.751240 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.751296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.751313 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.751338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.751356 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.756837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.757086 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.757140 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.757166 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.757259 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:36:33.757231284 +0000 UTC m=+142.626560261 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853398 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853453 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853602 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.853558 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.853683 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853641 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853774 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.853800 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.853862 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: E0318 15:36:01.854111 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.875115 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.897596 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.915526 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.932182 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.950138 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.956144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.956220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.956242 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.956296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.956315 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:01Z","lastTransitionTime":"2026-03-18T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.969234 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.979527 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:01 crc kubenswrapper[4792]: I0318 15:36:01.991010 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:01Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.013773 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.028314 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.053048 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.058221 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.058256 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.058267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.058283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.058295 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.069082 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.080587 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.091769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.102735 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc 
kubenswrapper[4792]: I0318 15:36:02.115868 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.126920 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.159959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.160010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.160019 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc 
kubenswrapper[4792]: I0318 15:36:02.160031 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.160040 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.262561 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.262816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.262838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.262861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.262882 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.366227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.366267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.366279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.366296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.366308 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.469589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.469643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.469661 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.469686 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.469703 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.572390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.572448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.572466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.572490 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.572508 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.675779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.675834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.675852 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.675875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.675890 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.779965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.780075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.780101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.780134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.780154 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.815837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.815890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.815911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.815934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.815948 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: E0318 15:36:02.836383 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.840378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.840417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.840432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.840459 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.840474 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: E0318 15:36:02.854691 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.859085 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.859134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.859146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.859165 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.859178 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:02 crc kubenswrapper[4792]: E0318 15:36:02.913017 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:02Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:02 crc kubenswrapper[4792]: E0318 15:36:02.913177 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.914936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.915012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.915026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.915044 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:02 crc kubenswrapper[4792]: I0318 15:36:02.915058 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:02Z","lastTransitionTime":"2026-03-18T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.018253 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.018311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.018320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.018335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.018345 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.121051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.121126 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.121161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.121191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.121213 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.224154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.224210 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.224231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.224253 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.224270 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.327900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.327952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.327996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.328022 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.328040 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.430795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.430861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.430885 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.430919 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.430943 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.533804 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.533866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.533878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.533893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.533903 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.636844 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.636920 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.636943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.637002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.637026 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.739863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.739934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.739957 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.740045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.740071 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.842904 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.843025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.843053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.843084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.843110 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.854274 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:03 crc kubenswrapper[4792]: E0318 15:36:03.854420 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.854839 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:03 crc kubenswrapper[4792]: E0318 15:36:03.854908 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.854957 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:03 crc kubenswrapper[4792]: E0318 15:36:03.855048 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.855194 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:03 crc kubenswrapper[4792]: E0318 15:36:03.855265 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.946381 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.946691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.946855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.947001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:03 crc kubenswrapper[4792]: I0318 15:36:03.947141 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:03Z","lastTransitionTime":"2026-03-18T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.054711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.055099 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.055238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.055375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.055496 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.158754 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.159132 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.159274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.159427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.159557 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.261922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.262046 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.262065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.262089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.262107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.365322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.365394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.365418 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.365447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.365471 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.468666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.468757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.468777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.468835 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.468856 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.571636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.571712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.571735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.571767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.571790 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.675036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.675099 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.675121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.675150 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.675172 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.777492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.777540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.777557 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.777581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.777598 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.880567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.880594 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.880603 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.880618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.880627 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.983508 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.983536 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.983544 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.983556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:04 crc kubenswrapper[4792]: I0318 15:36:04.983567 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:04Z","lastTransitionTime":"2026-03-18T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.087097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.087149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.087166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.087192 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.087209 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.190405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.190479 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.190503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.190530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.190551 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.294387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.294451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.294472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.294505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.294526 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.397125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.397162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.397172 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.397188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.397198 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.499246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.499317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.499334 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.499361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.499379 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.602502 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.602564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.602589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.602616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.602637 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.705279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.705345 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.705366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.705394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.705423 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.808060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.808136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.808157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.808185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.808205 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.854067 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.854158 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:05 crc kubenswrapper[4792]: E0318 15:36:05.854255 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.854319 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:05 crc kubenswrapper[4792]: E0318 15:36:05.854435 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.854473 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:05 crc kubenswrapper[4792]: E0318 15:36:05.854859 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:05 crc kubenswrapper[4792]: E0318 15:36:05.855050 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.855161 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:36:05 crc kubenswrapper[4792]: E0318 15:36:05.855414 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.911259 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.911311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.911328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.911350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:05 crc kubenswrapper[4792]: I0318 15:36:05.911368 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:05Z","lastTransitionTime":"2026-03-18T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.013317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.013355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.013366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.013381 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.013391 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.115505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.115590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.115612 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.115640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.115661 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.219055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.219118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.219135 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.219158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.219175 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.322233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.322329 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.322355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.322385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.322408 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.424390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.424417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.424426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.424440 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.424452 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.527219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.527259 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.527269 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.527284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.527295 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.629536 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.629600 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.629622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.629643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.629656 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.732494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.732558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.732582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.732616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.732641 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.835146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.835199 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.835209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.835225 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.835240 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.937358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.937396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.937406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.937421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:06 crc kubenswrapper[4792]: I0318 15:36:06.937432 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:06Z","lastTransitionTime":"2026-03-18T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.039286 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.039318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.039329 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.039345 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.039356 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.142233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.142320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.142343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.142373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.142399 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.245656 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.245745 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.245779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.245808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.245829 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.348323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.348361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.348379 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.348395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.348405 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.452109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.452185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.452209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.452243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.452266 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.555492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.555555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.555569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.555593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.555608 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.658894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.658958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.659005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.659031 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.659048 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.761647 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.761706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.761727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.761752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.761768 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.854305 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.854385 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.854411 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.854417 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:07 crc kubenswrapper[4792]: E0318 15:36:07.854483 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:07 crc kubenswrapper[4792]: E0318 15:36:07.854632 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:07 crc kubenswrapper[4792]: E0318 15:36:07.854686 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:07 crc kubenswrapper[4792]: E0318 15:36:07.854746 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.863655 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.863722 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.863742 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.863769 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.863788 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.967320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.967360 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.967368 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.967385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:07 crc kubenswrapper[4792]: I0318 15:36:07.967394 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:07Z","lastTransitionTime":"2026-03-18T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.071254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.071324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.071343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.071372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.071390 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.174730 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.174806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.174829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.174857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.174879 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.277195 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.277246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.277264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.277288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.277301 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.380212 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.380261 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.380279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.380301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.380318 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.483374 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.483439 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.483460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.483483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.483502 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.586944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.587030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.587045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.587060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.587071 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.690941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.691041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.691057 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.691077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.691091 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.794212 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.794283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.794312 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.794343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.794367 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.867143 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.896879 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.896925 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.896938 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.896954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:08 crc kubenswrapper[4792]: I0318 15:36:08.896983 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:08Z","lastTransitionTime":"2026-03-18T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.000645 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.001079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.001097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.001120 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.001145 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.104030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.104091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.104103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.104120 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.104163 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.207264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.207315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.207327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.207349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.207359 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.311066 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.311123 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.311139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.311161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.311177 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.414956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.415079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.415096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.415120 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.415137 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.517696 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.517748 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.517765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.517781 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.517791 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.620210 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.620288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.620308 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.620337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.620357 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.724111 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.724171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.724187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.724206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.724225 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.827676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.827732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.827743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.827768 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.827779 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.853586 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.853749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:09 crc kubenswrapper[4792]: E0318 15:36:09.853916 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.853944 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.853988 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:09 crc kubenswrapper[4792]: E0318 15:36:09.854136 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:09 crc kubenswrapper[4792]: E0318 15:36:09.854291 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:09 crc kubenswrapper[4792]: E0318 15:36:09.854542 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.930569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.930613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.930667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.930695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:09 crc kubenswrapper[4792]: I0318 15:36:09.930793 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:09Z","lastTransitionTime":"2026-03-18T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.034574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.034634 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.034652 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.034679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.034705 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.136881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.136933 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.136950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.137008 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.137026 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.239657 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.239697 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.239706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.239721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.239729 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.341698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.341737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.341746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.341797 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.341808 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.444863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.444923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.444941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.444992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.445011 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.547058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.547124 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.547142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.547166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.547184 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.649708 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.649789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.649812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.649845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.649870 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.751996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.752035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.752043 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.752058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.752067 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.854232 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.854268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.854276 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.854289 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.854297 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.956659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.956713 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.956733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.956761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:10 crc kubenswrapper[4792]: I0318 15:36:10.956779 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:10Z","lastTransitionTime":"2026-03-18T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.060254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.060337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.060362 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.060394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.060418 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.162949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.163034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.163060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.163089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.163110 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.266011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.266092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.266117 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.266149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.266173 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.368827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.368936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.368961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.369045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.369070 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.472394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.472453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.472471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.472495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.472514 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.575876 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.575924 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.575960 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.576038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.576063 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.679205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.679269 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.679295 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.679324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.679347 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.782724 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.783659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.783875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.784118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.784247 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:11Z","lastTransitionTime":"2026-03-18T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.853420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.853420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.854288 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.853512 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.853489 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.854476 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.854536 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.854590 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.876849 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.885311 4792 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.896159 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.911157 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.927612 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:11 crc kubenswrapper[4792]: E0318 15:36:11.954764 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.961050 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:11 crc kubenswrapper[4792]: I0318 15:36:11.984200 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:11Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.006591 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.028938 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.044775 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.057595 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.066825 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.076118 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.087165 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.098813 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.111291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.131258 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.141358 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.157392 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.855129 4792 scope.go:117] "RemoveContainer" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" Mar 18 15:36:12 crc kubenswrapper[4792]: E0318 15:36:12.855369 
4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.939344 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.939406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.939427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.939451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.939470 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:12Z","lastTransitionTime":"2026-03-18T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:12 crc kubenswrapper[4792]: E0318 15:36:12.959222 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.964025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.964074 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.964089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.964113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.964130 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:12Z","lastTransitionTime":"2026-03-18T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:12 crc kubenswrapper[4792]: E0318 15:36:12.978192 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.981860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.981905 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.981922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.981941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.981958 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:12Z","lastTransitionTime":"2026-03-18T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:12 crc kubenswrapper[4792]: E0318 15:36:12.995629 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:12Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.999202 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.999255 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.999268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.999282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:12 crc kubenswrapper[4792]: I0318 15:36:12.999294 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:12Z","lastTransitionTime":"2026-03-18T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.012055 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:13Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.016620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.016671 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.016683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.016702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.016712 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:13Z","lastTransitionTime":"2026-03-18T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.029380 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:13Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.029536 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.854320 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.854380 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.854330 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.854455 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:13 crc kubenswrapper[4792]: I0318 15:36:13.854491 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.854645 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.855095 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:13 crc kubenswrapper[4792]: E0318 15:36:13.855294 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:14 crc kubenswrapper[4792]: I0318 15:36:14.187350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:14 crc kubenswrapper[4792]: E0318 15:36:14.187605 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:36:14 crc kubenswrapper[4792]: E0318 15:36:14.187740 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:36:46.187706185 +0000 UTC m=+155.057035202 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:36:15 crc kubenswrapper[4792]: I0318 15:36:15.853520 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:15 crc kubenswrapper[4792]: E0318 15:36:15.853839 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:15 crc kubenswrapper[4792]: I0318 15:36:15.853514 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:15 crc kubenswrapper[4792]: E0318 15:36:15.854026 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:15 crc kubenswrapper[4792]: I0318 15:36:15.853665 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:15 crc kubenswrapper[4792]: E0318 15:36:15.854098 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:15 crc kubenswrapper[4792]: I0318 15:36:15.853577 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:15 crc kubenswrapper[4792]: E0318 15:36:15.854172 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:16 crc kubenswrapper[4792]: E0318 15:36:16.956781 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.446271 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/0.log" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.446321 4792 generic.go:334] "Generic (PLEG): container finished" podID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" containerID="ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2" exitCode=1 Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.446364 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerDied","Data":"ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2"} Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.446723 4792 scope.go:117] "RemoveContainer" containerID="ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.478645 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.500932 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45
b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.519732 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.531634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.544689 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc 
kubenswrapper[4792]: I0318 15:36:17.561702 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.578686 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.595744 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.611500 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.624720 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.638137 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.648307 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.660863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.671266 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.682380 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.701133 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.715224 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5
48a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.729895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:17Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.853516 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.853552 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:17 crc kubenswrapper[4792]: E0318 15:36:17.853628 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.853674 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:17 crc kubenswrapper[4792]: E0318 15:36:17.853771 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:17 crc kubenswrapper[4792]: I0318 15:36:17.853861 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:17 crc kubenswrapper[4792]: E0318 15:36:17.854035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:17 crc kubenswrapper[4792]: E0318 15:36:17.854245 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.450324 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/0.log" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.450989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerStarted","Data":"b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914"} Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.461904 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.479215 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.494229 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.508869 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.520687 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.534141 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.548675 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 
15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.563590 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.575994 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.590130 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.612128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.634044 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.649456 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.662323 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.676671 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.692802 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.702765 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:18 crc kubenswrapper[4792]: I0318 15:36:18.711339 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:18Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:19 crc 
kubenswrapper[4792]: I0318 15:36:19.853613 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:19 crc kubenswrapper[4792]: I0318 15:36:19.853713 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:19 crc kubenswrapper[4792]: E0318 15:36:19.853837 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:19 crc kubenswrapper[4792]: I0318 15:36:19.853881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:19 crc kubenswrapper[4792]: I0318 15:36:19.853883 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:19 crc kubenswrapper[4792]: I0318 15:36:19.854416 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:36:19 crc kubenswrapper[4792]: E0318 15:36:19.854589 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:19 crc kubenswrapper[4792]: E0318 15:36:19.854854 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:19 crc kubenswrapper[4792]: E0318 15:36:19.855060 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.458503 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.461072 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2"} Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.461346 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.482766 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.499139 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45
b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.518591 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.534119 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c6
6795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.549327 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc 
kubenswrapper[4792]: I0318 15:36:20.570141 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.583742 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.603396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.616737 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.629463 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.643304 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.653394 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.664896 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.673803 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.685030 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.703041 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.717861 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:20 crc kubenswrapper[4792]: I0318 15:36:20.728174 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:20Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.853337 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.853375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.853375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:21 crc kubenswrapper[4792]: E0318 15:36:21.853577 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:21 crc kubenswrapper[4792]: E0318 15:36:21.853718 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:21 crc kubenswrapper[4792]: E0318 15:36:21.853815 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.854104 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:21 crc kubenswrapper[4792]: E0318 15:36:21.854279 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.877332 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda
9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93
089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.896385 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.911544 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c6
6795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.926082 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc 
kubenswrapper[4792]: I0318 15:36:21.940710 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.952272 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: E0318 15:36:21.958278 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.968233 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.981291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:21 crc kubenswrapper[4792]: I0318 15:36:21.991838 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:21Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.004708 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.020211 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.048677 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.076698 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.094179 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.115524 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.127594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.138275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:22 crc kubenswrapper[4792]: I0318 15:36:22.156009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:22Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.078629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.078673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.078684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.078700 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.078712 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:23Z","lastTransitionTime":"2026-03-18T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.102288 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:23Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.106384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.106432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.106447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.106470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.106486 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:23Z","lastTransitionTime":"2026-03-18T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.123426 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:23Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.127990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.128063 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.128079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.128097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.128110 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:23Z","lastTransitionTime":"2026-03-18T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.142667 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:23Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.146577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.146607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.146620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.146637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.146649 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:23Z","lastTransitionTime":"2026-03-18T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.159611 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:23Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.163566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.163627 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.163645 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.163669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.163684 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:23Z","lastTransitionTime":"2026-03-18T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.177964 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:23Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.178115 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.853645 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.853666 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.853676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.853874 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.853699 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.853999 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.854111 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:23 crc kubenswrapper[4792]: E0318 15:36:23.854228 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:23 crc kubenswrapper[4792]: I0318 15:36:23.867894 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 15:36:25 crc kubenswrapper[4792]: I0318 15:36:25.853514 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:25 crc kubenswrapper[4792]: I0318 15:36:25.853596 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:25 crc kubenswrapper[4792]: I0318 15:36:25.853660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:25 crc kubenswrapper[4792]: E0318 15:36:25.853707 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:25 crc kubenswrapper[4792]: I0318 15:36:25.853807 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:25 crc kubenswrapper[4792]: E0318 15:36:25.853828 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:25 crc kubenswrapper[4792]: E0318 15:36:25.854063 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:25 crc kubenswrapper[4792]: E0318 15:36:25.854192 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:25 crc kubenswrapper[4792]: I0318 15:36:25.855161 4792 scope.go:117] "RemoveContainer" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.486658 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/2.log" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.491963 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec"} Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.492543 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.516181 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 
15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.536322 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.548494 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.564463 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.588465 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.619267 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.636826 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.655949 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.678782 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.701113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.715220 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.728099 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc 
kubenswrapper[4792]: I0318 15:36:26.744552 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.763738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.784117 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.800682 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.817403 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.837181 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: I0318 15:36:26.848315 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:26Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:26 crc kubenswrapper[4792]: E0318 15:36:26.959147 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.497830 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/3.log" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.498941 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/2.log" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.502235 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" exitCode=1 Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.502287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec"} Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.502329 4792 scope.go:117] "RemoveContainer" containerID="47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.503268 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" Mar 18 15:36:27 crc kubenswrapper[4792]: E0318 15:36:27.503484 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.521081 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.540107 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.557076 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.576459 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.590481 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.611323 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.629229 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.650307 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 
15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.670350 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.683308 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.698775 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.722223 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47bf462e20ec99f92ceb3c715a18de3c479840e743bb15f34bd8a48a92820cac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:35:57Z\\\",\\\"message\\\":\\\"n-kubernetes/ovnkube-node-4pndk openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-machine-config-operator/machine-config-daemon-2wtm6 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb 
openshift-dns/node-resolver-tbjvb openshift-etcd/etcd-crc]\\\\nI0318 15:35:57.748863 6970 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0318 15:35:57.748883 6970 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF0318 15:35:57.748890 6970 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},
{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.750098 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.767447 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.787639 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.814866 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.836240 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.852030 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.854230 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.854305 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.854317 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.854394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:27 crc kubenswrapper[4792]: E0318 15:36:27.854564 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:27 crc kubenswrapper[4792]: E0318 15:36:27.854633 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:27 crc kubenswrapper[4792]: E0318 15:36:27.854685 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:27 crc kubenswrapper[4792]: E0318 15:36:27.854715 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:27 crc kubenswrapper[4792]: I0318 15:36:27.868792 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:27Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc 
kubenswrapper[4792]: I0318 15:36:28.509218 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/3.log" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.514283 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" Mar 18 15:36:28 crc kubenswrapper[4792]: E0318 15:36:28.514470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.540447 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.556886 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.567431 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.586964 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.601511 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.610886 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.620219 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc 
kubenswrapper[4792]: I0318 15:36:28.629778 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.640007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.649835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.674487 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.692812 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.705448 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.727495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.743757 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 
15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.757605 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.767008 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.777140 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:28 crc kubenswrapper[4792]: I0318 15:36:28.793671 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch 
factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:28Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:29 crc kubenswrapper[4792]: I0318 15:36:29.853967 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:29 crc kubenswrapper[4792]: I0318 15:36:29.854097 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:29 crc kubenswrapper[4792]: I0318 15:36:29.854153 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:29 crc kubenswrapper[4792]: E0318 15:36:29.854263 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:29 crc kubenswrapper[4792]: I0318 15:36:29.854303 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:29 crc kubenswrapper[4792]: E0318 15:36:29.854460 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:29 crc kubenswrapper[4792]: E0318 15:36:29.854643 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:29 crc kubenswrapper[4792]: E0318 15:36:29.854754 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.853579 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.853693 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:31 crc kubenswrapper[4792]: E0318 15:36:31.853857 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.853909 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.854006 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:31 crc kubenswrapper[4792]: E0318 15:36:31.854206 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:31 crc kubenswrapper[4792]: E0318 15:36:31.854425 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:31 crc kubenswrapper[4792]: E0318 15:36:31.854544 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.884484 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.901873 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.916353 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c6
6795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.928030 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc 
kubenswrapper[4792]: I0318 15:36:31.943557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.952915 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: E0318 15:36:31.959950 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.975355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8358091
8ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:31 crc kubenswrapper[4792]: I0318 15:36:31.994897 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:31Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.008512 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.021910 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.037858 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.053857 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.068380 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.088357 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.105937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.126288 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch 
factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.142774 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.156316 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:32 crc kubenswrapper[4792]: I0318 15:36:32.170180 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:32Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.325861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.325927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.325950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.326016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.326042 4792 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:33Z","lastTransitionTime":"2026-03-18T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.345752 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.349710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.349743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.349752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.349768 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.349777 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:33Z","lastTransitionTime":"2026-03-18T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.361199 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.371572 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.371622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.371633 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.371649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.372046 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:33Z","lastTransitionTime":"2026-03-18T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.390328 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:33Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.394785 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.394849 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.394868 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.394940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.395040 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:33Z","lastTransitionTime":"2026-03-18T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.435216 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.737180 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.737332 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:37.737310396 +0000 UTC m=+206.606639343 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.737769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.737884 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.737951 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:37.737934189 +0000 UTC m=+206.607263136 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.738235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738371 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738441 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:37.738426337 +0000 UTC m=+206.607755284 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.738674 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738792 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738808 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738821 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.738859 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:37.738849883 +0000 UTC m=+206.608178830 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.840269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.840458 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.840494 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.840515 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.840595 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:37.840572683 +0000 UTC m=+206.709901660 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.854311 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.854378 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.854529 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.854599 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.854758 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.854823 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:33 crc kubenswrapper[4792]: I0318 15:36:33.854929 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:33 crc kubenswrapper[4792]: E0318 15:36:33.855071 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:35 crc kubenswrapper[4792]: I0318 15:36:35.853646 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:35 crc kubenswrapper[4792]: I0318 15:36:35.853920 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:35 crc kubenswrapper[4792]: I0318 15:36:35.854024 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:35 crc kubenswrapper[4792]: E0318 15:36:35.854106 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:35 crc kubenswrapper[4792]: I0318 15:36:35.853941 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:35 crc kubenswrapper[4792]: E0318 15:36:35.854240 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:35 crc kubenswrapper[4792]: E0318 15:36:35.854370 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:35 crc kubenswrapper[4792]: E0318 15:36:35.853865 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:36 crc kubenswrapper[4792]: E0318 15:36:36.962207 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:36:37 crc kubenswrapper[4792]: I0318 15:36:37.853613 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:37 crc kubenswrapper[4792]: I0318 15:36:37.853642 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:37 crc kubenswrapper[4792]: I0318 15:36:37.853668 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:37 crc kubenswrapper[4792]: I0318 15:36:37.853786 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:37 crc kubenswrapper[4792]: E0318 15:36:37.853920 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:37 crc kubenswrapper[4792]: E0318 15:36:37.854232 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:37 crc kubenswrapper[4792]: E0318 15:36:37.854306 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:37 crc kubenswrapper[4792]: E0318 15:36:37.854369 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.048589 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.068902 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0
547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.085006 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.104385 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.124678 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.140549 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.159265 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.173662 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.198202 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225
ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.222619 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.239837 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.257262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.281941 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch 
factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.305870 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.323243 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.338398 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.363712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.386192 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.400830 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:38 crc kubenswrapper[4792]: I0318 15:36:38.414190 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:38Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:39 crc 
kubenswrapper[4792]: I0318 15:36:39.854014 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:39 crc kubenswrapper[4792]: I0318 15:36:39.854100 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:39 crc kubenswrapper[4792]: I0318 15:36:39.854124 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:39 crc kubenswrapper[4792]: I0318 15:36:39.854153 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:39 crc kubenswrapper[4792]: E0318 15:36:39.854762 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:39 crc kubenswrapper[4792]: E0318 15:36:39.855016 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:39 crc kubenswrapper[4792]: E0318 15:36:39.855117 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:39 crc kubenswrapper[4792]: E0318 15:36:39.855212 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.853647 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.853805 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.853883 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.854093 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.854205 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.854322 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.854346 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.855474 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.855907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.856244 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.875098 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.889482 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.908809 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.926099 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d
9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.946247 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: E0318 15:36:41.963438 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.968824 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:41 crc kubenswrapper[4792]: I0318 15:36:41.990596 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:41Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.008043 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.030576 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch 
factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.047672 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225
ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.065529 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.078211 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.109477 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.128159 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.139237 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.152491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc 
kubenswrapper[4792]: I0318 15:36:42.166403 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.181386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:42 crc kubenswrapper[4792]: I0318 15:36:42.204346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:42Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.667071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.667121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.667138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.667157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.667167 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:43Z","lastTransitionTime":"2026-03-18T15:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.687912 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.693108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.693153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.693164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.693185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.693198 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:43Z","lastTransitionTime":"2026-03-18T15:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.710702 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.715403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.715435 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.715445 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.715461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.715471 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:43Z","lastTransitionTime":"2026-03-18T15:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.734255 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.738857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.738914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.738930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.738954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.738993 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:43Z","lastTransitionTime":"2026-03-18T15:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.754869 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.759297 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.759361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.759379 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.759403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.759421 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:43Z","lastTransitionTime":"2026-03-18T15:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.774011 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:43Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.774127 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.853775 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.853850 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.853860 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.854006 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:43 crc kubenswrapper[4792]: I0318 15:36:43.854055 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.854226 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.854312 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:43 crc kubenswrapper[4792]: E0318 15:36:43.854398 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:45 crc kubenswrapper[4792]: I0318 15:36:45.854195 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:45 crc kubenswrapper[4792]: E0318 15:36:45.854678 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:45 crc kubenswrapper[4792]: I0318 15:36:45.855065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:45 crc kubenswrapper[4792]: E0318 15:36:45.855214 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:45 crc kubenswrapper[4792]: I0318 15:36:45.855254 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:45 crc kubenswrapper[4792]: I0318 15:36:45.855384 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:45 crc kubenswrapper[4792]: E0318 15:36:45.855380 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:45 crc kubenswrapper[4792]: E0318 15:36:45.855459 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:46 crc kubenswrapper[4792]: I0318 15:36:46.274803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:46 crc kubenswrapper[4792]: E0318 15:36:46.275092 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:36:46 crc kubenswrapper[4792]: E0318 15:36:46.275220 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs podName:f6d7b0a3-b8fe-49f9-91ad-ae46796becbc nodeName:}" failed. No retries permitted until 2026-03-18 15:37:50.275188237 +0000 UTC m=+219.144517204 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs") pod "network-metrics-daemon-rpvb6" (UID: "f6d7b0a3-b8fe-49f9-91ad-ae46796becbc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 15:36:46 crc kubenswrapper[4792]: E0318 15:36:46.965191 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:36:47 crc kubenswrapper[4792]: I0318 15:36:47.853868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:47 crc kubenswrapper[4792]: I0318 15:36:47.853997 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:47 crc kubenswrapper[4792]: I0318 15:36:47.854079 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:47 crc kubenswrapper[4792]: I0318 15:36:47.854587 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:47 crc kubenswrapper[4792]: E0318 15:36:47.854801 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:47 crc kubenswrapper[4792]: E0318 15:36:47.855115 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:47 crc kubenswrapper[4792]: E0318 15:36:47.855337 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:47 crc kubenswrapper[4792]: E0318 15:36:47.855602 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:49 crc kubenswrapper[4792]: I0318 15:36:49.854091 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:49 crc kubenswrapper[4792]: I0318 15:36:49.854152 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:49 crc kubenswrapper[4792]: I0318 15:36:49.854163 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:49 crc kubenswrapper[4792]: I0318 15:36:49.854105 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:49 crc kubenswrapper[4792]: E0318 15:36:49.854344 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:49 crc kubenswrapper[4792]: E0318 15:36:49.854463 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:49 crc kubenswrapper[4792]: E0318 15:36:49.854596 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:49 crc kubenswrapper[4792]: E0318 15:36:49.854750 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.853844 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.853890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.853910 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.853867 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:51 crc kubenswrapper[4792]: E0318 15:36:51.854009 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:51 crc kubenswrapper[4792]: E0318 15:36:51.854089 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:51 crc kubenswrapper[4792]: E0318 15:36:51.854198 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:51 crc kubenswrapper[4792]: E0318 15:36:51.854288 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.877504 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e46cf24-bfda-4bce-86e4-42a4b755b41f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:35:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 15:35:25.638306 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 15:35:25.638498 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 15:35:25.639856 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3774181797/tls.crt::/tmp/serving-cert-3774181797/tls.key\\\\\\\"\\\\nI0318 15:35:26.076487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 15:35:26.079716 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 15:35:26.079732 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 15:35:26.079748 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 15:35:26.079753 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 15:35:26.084435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0318 15:35:26.084447 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0318 15:35:26.084493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084506 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 15:35:26.084519 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 15:35:26.084528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 15:35:26.084535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 15:35:26.084543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0318 15:35:26.086585 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.899895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002d909b-8063-4034-be67-3a00201570ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e06d1294eea8bfad5a674d363ee2676bb426faee014989df2613ed1d4715dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051f0f53bab85b351771422c6ef8f436f96bf4f91df3cc6dc6dbbd00a1dfe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T15:34:42Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 15:34:13.942583 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 15:34:13.945151 1 observer_polling.go:159] Starting file observer\\\\nI0318 15:34:13.975322 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 15:34:13.979629 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 15:34:42.796317 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 15:34:42.796448 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:34:42Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc807d7fe3c34722bcd925f5a0ce95c1187b2cf85b28e5b509536bd634de1919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58af27c1a3dec250d5f59c5a2776db72d722162f2bac17e0c1f10c1e3fda37ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.913499 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc248a47-9621-46e6-9a90-8848bb6c57af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f2c30d13736efa53d912b6ab9bdd66021ffeaa5a084ea2cc4afa2f67f3c664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcced67166886621188055ce71fa1ab71a832bba5eafd2ec53f10244613f7538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.927342 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f667809a001956826c11d7c8a8f2aca29347c369283880f5dd80625ae8c096b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.957881 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e6f23ed-13da-466a-8c55-1043d6e0b748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T15:36:26Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 6 for removal\\\\nI0318 15:36:26.869533 7327 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869536 7327 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 15:36:26.869572 7327 factory.go:656] Stopping watch 
factory\\\\nI0318 15:36:26.869597 7327 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 15:36:26.869206 7327 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.869907 7327 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 15:36:26.869927 7327 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 15:36:26.869930 7327 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 15:36:26.869188 7327 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 15:36:26.870074 7327 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 15:36:26.870039 7327 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:36:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec1f66a48db36fa369
b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mqswl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4pndk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:51 crc kubenswrapper[4792]: E0318 15:36:51.966034 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:36:51 crc kubenswrapper[4792]: I0318 15:36:51.992901 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e805ab-1420-44aa-b92f-631be742e82a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0230da852ce85376abb0d697ecadf2c61401f7a3e8fc48c1ad32046b6aa70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7442d9ca63f8888fda3f0b46542ed100aff45f85f88f730e48912ac2a66501bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16c12ba800edc7821f6f4834b80abfafa1e567a3ede03a1e9d143e9b1c752e09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b730bb9f33b64a5054d5c5d02a80b4b4c534e9c3b462d475db400ded0241d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd03f667bddec2b42af99c3d9d0fb650d76ffee1050a4952bd6628bb39ea4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e89fd5eeca5550fb61a90660bcfb2a5e18b17eeb46b02ce745114e84b441610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401b021815868ac40bd8585a8cceaf56c7820394cc9350dae0dd1cf4dc04cccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3940098e89c681e572bdf3a1701743a6d97f9418be91544a254de59be7bc872d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:51Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.006793 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-snwl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpvb6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.019118 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.036359 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51cef14-7d91-4e08-8045-831f7a9a65f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e0a395d4bb76e6fd8e0bfbfc4c2176a5acfe186312d87a4348fe6592ee8d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a1
16be63affc04b502df9e98f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wtm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.057672 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnps4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c61302f-31a0-4ba3-99b0-e5206c848cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e23ff3ec4ac11ea3ae7da38a6ccda9474b56b12e9c94e0d897df5d9a96ac041\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f8334926668042597fea5b1b27add2abffec04190d863e4045cd6f8a3d12c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff83580918ea2cdd502341ce912d9341b3a58776a7f03a71c0cb4424e31ab925\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f87fda9553ad81d637417babd60cb05eadf3a7bd66b6919694740e38b88c3cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bebd
7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bebd7a9c48640ac4e2565ae43752614e781685d0dca20255f98a90783b65f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55dc1c0a54ca96fb9e6a14423a066c7d221743a54bb928afaa3bade8e49f0adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76812e973ce99656160d45b873704d13b9ef5dc004b3457f0efeabd575c17752\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s44sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnps4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.077626 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fqr6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"241b9e3f-bd41-4fb2-a68a-9395a67feaae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-18T15:36:17Z\\\",\\\"message\\\":\\\"2026-03-18T15:35:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da\\\\n2026-03-18T15:35:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9fbec11c-51f7-4d77-bc41-24439dbff7da to /host/opt/cni/bin/\\\\n2026-03-18T15:35:32Z [verbose] multus-daemon started\\\\n2026-03-18T15:35:32Z [verbose] Readiness Indicator file check\\\\n2026-03-18T15:36:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:36:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mkdtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fqr6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.091961 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-772vs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d0b1b6-4001-4571-8af7-57a361c58c49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b587091652fadc8c66795283aa66477a991f39d095d3189a3ee6c0fc8982615a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbctn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-772vs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.105500 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab07a75ca0ec3fa4edd9e6b0775db022aa7c32a88d9fc05ebd4619878405acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18
T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1615e28351a001e35df98679286c8f41fe6f1f5fbbc0f363082694448b9a2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.117045 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5263fba0-5316-48e0-a254-a4c598e30f02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101a3ab5aa6b386b987c20ec4d5319209a31c2a47d33e5b53655a1d310bcf515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e72a286301ed19f52214c3cec88987f1f197d9af7cbd69301b7f8a183bc7d2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtwbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.132107 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53a74fde-24f5-468b-91c3-376086a65984\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9c26dade4bc16ed52262df7430a5f246e2545ea57bbd2c508b9bae8e63eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0d5b4b9ef65007249726829546a32433da7182af26880f9fa147d78ab3c4e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:34:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0547ae6824b8e245f2d4dacafe01b649988cdc11397acbf6d5ef4cd9f34431ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T15:34:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T15:34:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:34:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.144620 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.157213 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d95b40fa1bd878ea8632628ec06ed38ea8600d1261999383082ad8b05afc7fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.170208 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:52 crc kubenswrapper[4792]: I0318 15:36:52.184820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tbjvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f709bbd-6cce-421b-90fe-8c9047004002\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T15:35:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e1415fb5086f3caf2ed2d1e36125ea66b83077592f67135178114fa04ff6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T15:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vjlg8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T15:35:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tbjvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:52Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:53 crc kubenswrapper[4792]: I0318 15:36:53.854081 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:53 crc kubenswrapper[4792]: I0318 15:36:53.854225 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:53 crc kubenswrapper[4792]: I0318 15:36:53.854321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:53 crc kubenswrapper[4792]: I0318 15:36:53.854428 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:53 crc kubenswrapper[4792]: E0318 15:36:53.855150 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:53 crc kubenswrapper[4792]: I0318 15:36:53.855206 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" Mar 18 15:36:53 crc kubenswrapper[4792]: E0318 15:36:53.855373 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" Mar 18 15:36:53 crc kubenswrapper[4792]: E0318 15:36:53.855469 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:53 crc kubenswrapper[4792]: E0318 15:36:53.855548 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:53 crc kubenswrapper[4792]: E0318 15:36:53.855640 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.100003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.100070 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.100084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.100105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.100121 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:54Z","lastTransitionTime":"2026-03-18T15:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:54 crc kubenswrapper[4792]: E0318 15:36:54.120960 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:54Z is after 2025-08-24T17:21:41Z"
Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.127051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.127125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.127143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.127165 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.127228 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:54Z","lastTransitionTime":"2026-03-18T15:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.186844 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.186900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.186942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.186959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.186982 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:54Z","lastTransitionTime":"2026-03-18T15:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:54 crc kubenswrapper[4792]: E0318 15:36:54.204611 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.210138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.210191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.210209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.210235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:36:54 crc kubenswrapper[4792]: I0318 15:36:54.210254 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:36:54Z","lastTransitionTime":"2026-03-18T15:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 15:36:54 crc kubenswrapper[4792]: E0318 15:36:54.231043 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T15:36:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"85dc16d2-3bef-455f-aff6-17a99cc51456\\\",\\\"systemUUID\\\":\\\"23b8af29-cf8d-424d-a7c3-490f2387f9d8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T15:36:54Z is after 2025-08-24T17:21:41Z" Mar 18 15:36:54 crc kubenswrapper[4792]: E0318 15:36:54.231296 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 15:36:55 crc kubenswrapper[4792]: I0318 15:36:55.853705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:55 crc kubenswrapper[4792]: I0318 15:36:55.853781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:55 crc kubenswrapper[4792]: I0318 15:36:55.854188 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:55 crc kubenswrapper[4792]: I0318 15:36:55.854215 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:55 crc kubenswrapper[4792]: E0318 15:36:55.854321 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:55 crc kubenswrapper[4792]: E0318 15:36:55.854774 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:55 crc kubenswrapper[4792]: E0318 15:36:55.854913 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:55 crc kubenswrapper[4792]: E0318 15:36:55.855106 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:56 crc kubenswrapper[4792]: E0318 15:36:56.967010 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:36:57 crc kubenswrapper[4792]: I0318 15:36:57.854306 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:57 crc kubenswrapper[4792]: I0318 15:36:57.854464 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:57 crc kubenswrapper[4792]: I0318 15:36:57.854560 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:57 crc kubenswrapper[4792]: E0318 15:36:57.854670 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:57 crc kubenswrapper[4792]: I0318 15:36:57.854587 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:57 crc kubenswrapper[4792]: E0318 15:36:57.854768 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:57 crc kubenswrapper[4792]: E0318 15:36:57.855028 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:57 crc kubenswrapper[4792]: E0318 15:36:57.855073 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:36:59 crc kubenswrapper[4792]: I0318 15:36:59.853679 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:36:59 crc kubenswrapper[4792]: I0318 15:36:59.853727 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:36:59 crc kubenswrapper[4792]: I0318 15:36:59.853820 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:36:59 crc kubenswrapper[4792]: E0318 15:36:59.853947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:36:59 crc kubenswrapper[4792]: E0318 15:36:59.854090 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:36:59 crc kubenswrapper[4792]: I0318 15:36:59.854148 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:36:59 crc kubenswrapper[4792]: E0318 15:36:59.854246 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:36:59 crc kubenswrapper[4792]: E0318 15:36:59.854324 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.853336 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.853393 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.853344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:01 crc kubenswrapper[4792]: E0318 15:37:01.853553 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.853630 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:37:01 crc kubenswrapper[4792]: E0318 15:37:01.853795 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:01 crc kubenswrapper[4792]: E0318 15:37:01.853913 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:37:01 crc kubenswrapper[4792]: E0318 15:37:01.854136 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:01 crc kubenswrapper[4792]: E0318 15:37:01.967565 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.979892 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.979866823 podStartE2EDuration="1m22.979866823s" podCreationTimestamp="2026-03-18 15:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:01.956985244 +0000 UTC m=+170.826314181" watchObservedRunningTime="2026-03-18 15:37:01.979866823 +0000 UTC m=+170.849195760" Mar 18 15:37:01 crc kubenswrapper[4792]: I0318 15:37:01.980084 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=53.980078331 podStartE2EDuration="53.980078331s" podCreationTimestamp="2026-03-18 15:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:01.979747029 +0000 UTC 
m=+170.849075976" watchObservedRunningTime="2026-03-18 15:37:01.980078331 +0000 UTC m=+170.849407268" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.019163 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=80.019133723 podStartE2EDuration="1m20.019133723s" podCreationTimestamp="2026-03-18 15:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:01.997367945 +0000 UTC m=+170.866696882" watchObservedRunningTime="2026-03-18 15:37:02.019133723 +0000 UTC m=+170.888462660" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.019450 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.019445425 podStartE2EDuration="1m8.019445425s" podCreationTimestamp="2026-03-18 15:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.018697807 +0000 UTC m=+170.888026754" watchObservedRunningTime="2026-03-18 15:37:02.019445425 +0000 UTC m=+170.888774362" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.046595 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fqr6h" podStartSLOduration=123.046572559 podStartE2EDuration="2m3.046572559s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.034244148 +0000 UTC m=+170.903573105" watchObservedRunningTime="2026-03-18 15:37:02.046572559 +0000 UTC m=+170.915901496" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.047240 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-772vs" 
podStartSLOduration=123.047234914 podStartE2EDuration="2m3.047234914s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.046748226 +0000 UTC m=+170.916077183" watchObservedRunningTime="2026-03-18 15:37:02.047234914 +0000 UTC m=+170.916563851" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.091433 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podStartSLOduration=123.091411743 podStartE2EDuration="2m3.091411743s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.090997758 +0000 UTC m=+170.960326695" watchObservedRunningTime="2026-03-18 15:37:02.091411743 +0000 UTC m=+170.960740700" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.114018 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qnps4" podStartSLOduration=123.113989291 podStartE2EDuration="2m3.113989291s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.109803529 +0000 UTC m=+170.979132466" watchObservedRunningTime="2026-03-18 15:37:02.113989291 +0000 UTC m=+170.983318238" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.155048 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tbjvb" podStartSLOduration=123.155029836 podStartE2EDuration="2m3.155029836s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 15:37:02.142137984 +0000 UTC m=+171.011466921" watchObservedRunningTime="2026-03-18 15:37:02.155029836 +0000 UTC m=+171.024358783" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.176174 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8lmh" podStartSLOduration=123.176157952 podStartE2EDuration="2m3.176157952s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.176046717 +0000 UTC m=+171.045375674" watchObservedRunningTime="2026-03-18 15:37:02.176157952 +0000 UTC m=+171.045486889" Mar 18 15:37:02 crc kubenswrapper[4792]: I0318 15:37:02.192879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.192863044 podStartE2EDuration="39.192863044s" podCreationTimestamp="2026-03-18 15:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:02.192117107 +0000 UTC m=+171.061446084" watchObservedRunningTime="2026-03-18 15:37:02.192863044 +0000 UTC m=+171.062191981" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.642791 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/1.log" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.644511 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/0.log" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.644814 4792 generic.go:334] "Generic (PLEG): container finished" podID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" 
containerID="b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914" exitCode=1 Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.644929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerDied","Data":"b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914"} Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.645313 4792 scope.go:117] "RemoveContainer" containerID="ffbf0a2094211808c755f6ad2c162dff2897b0a96ef214a0a88694cfbc8180a2" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.645880 4792 scope.go:117] "RemoveContainer" containerID="b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914" Mar 18 15:37:03 crc kubenswrapper[4792]: E0318 15:37:03.646151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fqr6h_openshift-multus(241b9e3f-bd41-4fb2-a68a-9395a67feaae)\"" pod="openshift-multus/multus-fqr6h" podUID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.856154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:03 crc kubenswrapper[4792]: E0318 15:37:03.856301 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.856455 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.856477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:03 crc kubenswrapper[4792]: E0318 15:37:03.856684 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc" Mar 18 15:37:03 crc kubenswrapper[4792]: E0318 15:37:03.856791 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 15:37:03 crc kubenswrapper[4792]: I0318 15:37:03.857014 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:03 crc kubenswrapper[4792]: E0318 15:37:03.857656 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.376279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.376353 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.376372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.376400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.376423 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T15:37:04Z","lastTransitionTime":"2026-03-18T15:37:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.438786 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"] Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.439470 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.443194 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.443857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.444080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.444204 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.584129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.584228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.584282 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9190371d-504d-415f-b7c1-4a84c4b7ff31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.584323 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9190371d-504d-415f-b7c1-4a84c4b7ff31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.584382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9190371d-504d-415f-b7c1-4a84c4b7ff31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.652511 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/1.log" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9190371d-504d-415f-b7c1-4a84c4b7ff31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9190371d-504d-415f-b7c1-4a84c4b7ff31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9190371d-504d-415f-b7c1-4a84c4b7ff31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.685557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" Mar 18 15:37:04 
crc kubenswrapper[4792]: I0318 15:37:04.685704 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9190371d-504d-415f-b7c1-4a84c4b7ff31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.687064 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9190371d-504d-415f-b7c1-4a84c4b7ff31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.695174 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9190371d-504d-415f-b7c1-4a84c4b7ff31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.711205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9190371d-504d-415f-b7c1-4a84c4b7ff31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-49ggg\" (UID: \"9190371d-504d-415f-b7c1-4a84c4b7ff31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.759672 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg"
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.926493 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 18 15:37:04 crc kubenswrapper[4792]: I0318 15:37:04.939327 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.658951 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" event={"ID":"9190371d-504d-415f-b7c1-4a84c4b7ff31","Type":"ContainerStarted","Data":"fa9d71f49bf7122c5379b8f0e0bfc277dc832e1c8f87b60ac32ec02cc430d40c"}
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.659708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" event={"ID":"9190371d-504d-415f-b7c1-4a84c4b7ff31","Type":"ContainerStarted","Data":"18df0567a1861f8503fd386eacb70ba42eb78a22fcc4e27e4a82f634ec7d9618"}
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.677836 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-49ggg" podStartSLOduration=126.677807246 podStartE2EDuration="2m6.677807246s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:05.676446677 +0000 UTC m=+174.545775634" watchObservedRunningTime="2026-03-18 15:37:05.677807246 +0000 UTC m=+174.547136193"
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.853410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.853422 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.853477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:05 crc kubenswrapper[4792]: E0318 15:37:05.854335 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:05 crc kubenswrapper[4792]: I0318 15:37:05.853700 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:05 crc kubenswrapper[4792]: E0318 15:37:05.853942 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:05 crc kubenswrapper[4792]: E0318 15:37:05.854408 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:05 crc kubenswrapper[4792]: E0318 15:37:05.854611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:06 crc kubenswrapper[4792]: I0318 15:37:06.854872 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec"
Mar 18 15:37:06 crc kubenswrapper[4792]: E0318 15:37:06.855161 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4pndk_openshift-ovn-kubernetes(4e6f23ed-13da-466a-8c55-1043d6e0b748)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748"
Mar 18 15:37:06 crc kubenswrapper[4792]: E0318 15:37:06.969041 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:37:07 crc kubenswrapper[4792]: I0318 15:37:07.853725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:07 crc kubenswrapper[4792]: I0318 15:37:07.853868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:07 crc kubenswrapper[4792]: I0318 15:37:07.853799 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:07 crc kubenswrapper[4792]: E0318 15:37:07.854059 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:07 crc kubenswrapper[4792]: I0318 15:37:07.854083 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:07 crc kubenswrapper[4792]: E0318 15:37:07.854221 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:07 crc kubenswrapper[4792]: E0318 15:37:07.854335 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:07 crc kubenswrapper[4792]: E0318 15:37:07.854509 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:09 crc kubenswrapper[4792]: I0318 15:37:09.853687 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:09 crc kubenswrapper[4792]: I0318 15:37:09.853822 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:09 crc kubenswrapper[4792]: I0318 15:37:09.853899 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:09 crc kubenswrapper[4792]: I0318 15:37:09.854317 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:09 crc kubenswrapper[4792]: E0318 15:37:09.854365 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:09 crc kubenswrapper[4792]: E0318 15:37:09.854441 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:09 crc kubenswrapper[4792]: E0318 15:37:09.854513 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:09 crc kubenswrapper[4792]: E0318 15:37:09.854591 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:11 crc kubenswrapper[4792]: I0318 15:37:11.854055 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:11 crc kubenswrapper[4792]: I0318 15:37:11.854059 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:11 crc kubenswrapper[4792]: I0318 15:37:11.854084 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:11 crc kubenswrapper[4792]: I0318 15:37:11.854128 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:11 crc kubenswrapper[4792]: E0318 15:37:11.855835 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:11 crc kubenswrapper[4792]: E0318 15:37:11.856027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:11 crc kubenswrapper[4792]: E0318 15:37:11.856152 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:11 crc kubenswrapper[4792]: E0318 15:37:11.856368 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:11 crc kubenswrapper[4792]: E0318 15:37:11.969859 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:37:13 crc kubenswrapper[4792]: I0318 15:37:13.854378 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:13 crc kubenswrapper[4792]: I0318 15:37:13.854455 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:13 crc kubenswrapper[4792]: I0318 15:37:13.854376 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:13 crc kubenswrapper[4792]: I0318 15:37:13.854558 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:13 crc kubenswrapper[4792]: E0318 15:37:13.854769 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:13 crc kubenswrapper[4792]: E0318 15:37:13.855063 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:13 crc kubenswrapper[4792]: E0318 15:37:13.855171 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:13 crc kubenswrapper[4792]: E0318 15:37:13.855277 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:14 crc kubenswrapper[4792]: I0318 15:37:14.853840 4792 scope.go:117] "RemoveContainer" containerID="b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914"
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.693854 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/1.log"
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.693912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerStarted","Data":"8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217"}
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.853428 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.853496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.853496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:15 crc kubenswrapper[4792]: I0318 15:37:15.853448 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:15 crc kubenswrapper[4792]: E0318 15:37:15.853581 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:15 crc kubenswrapper[4792]: E0318 15:37:15.853703 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:15 crc kubenswrapper[4792]: E0318 15:37:15.853784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:15 crc kubenswrapper[4792]: E0318 15:37:15.853940 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:16 crc kubenswrapper[4792]: E0318 15:37:16.971461 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:37:17 crc kubenswrapper[4792]: I0318 15:37:17.854269 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:17 crc kubenswrapper[4792]: I0318 15:37:17.854373 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:17 crc kubenswrapper[4792]: I0318 15:37:17.854489 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:17 crc kubenswrapper[4792]: E0318 15:37:17.854482 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:17 crc kubenswrapper[4792]: E0318 15:37:17.854687 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:17 crc kubenswrapper[4792]: I0318 15:37:17.854759 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:17 crc kubenswrapper[4792]: E0318 15:37:17.854792 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:17 crc kubenswrapper[4792]: E0318 15:37:17.854955 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:18 crc kubenswrapper[4792]: I0318 15:37:18.854639 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.711051 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/3.log"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.714126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerStarted","Data":"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217"}
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.714615 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.747519 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podStartSLOduration=140.747502603 podStartE2EDuration="2m20.747502603s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:19.747182062 +0000 UTC m=+188.616511029" watchObservedRunningTime="2026-03-18 15:37:19.747502603 +0000 UTC m=+188.616831540"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.854242 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.854317 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.854347 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.854373 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:19 crc kubenswrapper[4792]: E0318 15:37:19.854422 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:19 crc kubenswrapper[4792]: E0318 15:37:19.854553 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:19 crc kubenswrapper[4792]: E0318 15:37:19.854711 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:19 crc kubenswrapper[4792]: E0318 15:37:19.854882 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:19 crc kubenswrapper[4792]: I0318 15:37:19.877403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpvb6"]
Mar 18 15:37:20 crc kubenswrapper[4792]: I0318 15:37:20.717783 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:20 crc kubenswrapper[4792]: E0318 15:37:20.717932 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:21 crc kubenswrapper[4792]: I0318 15:37:21.853249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:21 crc kubenswrapper[4792]: I0318 15:37:21.853317 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:21 crc kubenswrapper[4792]: I0318 15:37:21.853278 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:21 crc kubenswrapper[4792]: E0318 15:37:21.855664 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:21 crc kubenswrapper[4792]: E0318 15:37:21.856060 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:21 crc kubenswrapper[4792]: E0318 15:37:21.855945 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:21 crc kubenswrapper[4792]: E0318 15:37:21.972227 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 15:37:22 crc kubenswrapper[4792]: I0318 15:37:22.853247 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:22 crc kubenswrapper[4792]: E0318 15:37:22.853458 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:23 crc kubenswrapper[4792]: I0318 15:37:23.854260 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:23 crc kubenswrapper[4792]: I0318 15:37:23.854368 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:23 crc kubenswrapper[4792]: E0318 15:37:23.854428 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:23 crc kubenswrapper[4792]: I0318 15:37:23.854517 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:23 crc kubenswrapper[4792]: E0318 15:37:23.854614 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:23 crc kubenswrapper[4792]: E0318 15:37:23.854709 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:24 crc kubenswrapper[4792]: I0318 15:37:24.853940 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:24 crc kubenswrapper[4792]: E0318 15:37:24.854154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:25 crc kubenswrapper[4792]: I0318 15:37:25.854322 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:25 crc kubenswrapper[4792]: E0318 15:37:25.854748 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 15:37:25 crc kubenswrapper[4792]: I0318 15:37:25.854397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:25 crc kubenswrapper[4792]: E0318 15:37:25.854831 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 15:37:25 crc kubenswrapper[4792]: I0318 15:37:25.854322 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:25 crc kubenswrapper[4792]: E0318 15:37:25.854893 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 15:37:26 crc kubenswrapper[4792]: I0318 15:37:26.853752 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:26 crc kubenswrapper[4792]: E0318 15:37:26.853953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpvb6" podUID="f6d7b0a3-b8fe-49f9-91ad-ae46796becbc"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.853691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.853747 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.853826 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.858601 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.858932 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.859021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 15:37:27 crc kubenswrapper[4792]: I0318 15:37:27.859384 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 15:37:28 crc kubenswrapper[4792]: I0318 15:37:28.853853 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6"
Mar 18 15:37:28 crc kubenswrapper[4792]: I0318 15:37:28.856957 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 15:37:28 crc kubenswrapper[4792]: I0318 15:37:28.857080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 15:37:30 crc kubenswrapper[4792]: I0318 15:37:30.321495 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 15:37:30 crc kubenswrapper[4792]: I0318 15:37:30.321575 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:37:30 crc kubenswrapper[4792]: I0318 15:37:30.321720 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.788422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.835275 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.835598 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.837472 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.837695 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.839793 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.840270 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.840468 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.841132 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.841175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.841677 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.841682 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.841929 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.842321 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.842329 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.842814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.867262 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.867932 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.868222 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nc8"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.868802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.871166 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.871807 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.872068 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svr4m"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.880031 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.883742 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896168 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896235 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896317 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896464 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896489 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896581 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.896780 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.897841 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.897963 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9j5pp"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.898128 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.898324 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.898666 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.898689 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.898882 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.899326 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.899518 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.899793 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.900060 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.901459 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.901604 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.902220 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.902295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.904883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.905316 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pk6tp"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.905673 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.905877 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.906044 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.906466 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xknnv"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.906853 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.907891 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dkkhx"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.908723 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.912274 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.912835 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ltvvk"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.913163 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.913699 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.914187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.914493 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.915108 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bt6v4"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.915838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.918230 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.918843 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.919099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.920027 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.920496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.922228 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.922764 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tfth7"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.923254 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.923522 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.927697 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.928424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933409 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933654 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933789 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933917 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934019 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934092 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 15:37:34 crc 
kubenswrapper[4792]: I0318 15:37:34.934169 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934252 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934406 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934419 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.933907 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934549 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934217 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934664 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934687 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934814 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-auth-proxy-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934859 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934876 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2l2b\" (UniqueName: \"kubernetes.io/projected/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-kube-api-access-z2l2b\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934898 4792 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934932 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934816 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934901 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.934931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc954\" (UniqueName: \"kubernetes.io/projected/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-kube-api-access-xc954\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935078 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-metrics-tls\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935097 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935113 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935118 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn67g\" (UniqueName: \"kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9c5m\" (UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-kube-api-access-z9c5m\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88t6\" (UniqueName: \"kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8be2d62-1906-4029-91b4-7dd4d2197491-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935324 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf89w\" (UniqueName: \"kubernetes.io/projected/d4c0c858-0632-49da-b6d4-1b2e9f84f690-kube-api-access-sf89w\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935339 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v86c5\" (UniqueName: \"kubernetes.io/projected/be67fd32-9417-44ea-b20d-0af9897fea35-kube-api-access-v86c5\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935433 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935457 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntgf\" (UniqueName: \"kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 
15:37:34.935556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be67fd32-9417-44ea-b20d-0af9897fea35-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8be2d62-1906-4029-91b4-7dd4d2197491-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsn9\" (UniqueName: \"kubernetes.io/projected/7a74161a-cf63-45fe-b306-aa335ccd309e-kube-api-access-dgsn9\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935770 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935816 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-images\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-config\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.935965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be67fd32-9417-44ea-b20d-0af9897fea35-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a74161a-cf63-45fe-b306-aa335ccd309e-machine-approver-tls\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4c0c858-0632-49da-b6d4-1b2e9f84f690-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936563 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 15:37:34 crc kubenswrapper[4792]: 
I0318 15:37:34.936822 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.936926 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.937069 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.937202 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.938143 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.938765 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.939381 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.939657 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.939697 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.939789 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nc8"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.940014 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.940430 
4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.940439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.940688 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.940932 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.945377 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.945874 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.947257 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.947552 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.947942 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.948409 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.948489 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.967252 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.967464 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.967599 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.967724 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.968854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.969582 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.969928 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.970424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.970780 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.970982 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.971505 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.971729 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.971741 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.971916 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.972181 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.972202 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.973678 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.973762 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.974107 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.974222 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.975255 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.975533 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.976346 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.976556 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.976690 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.976815 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.976917 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.977046 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.977919 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.981578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.982544 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.982796 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.982947 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.985600 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.985944 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.986158 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.986420 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.990089 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.992316 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb"] Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.992937 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:34 crc kubenswrapper[4792]: I0318 15:37:34.999789 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.000662 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.008880 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.009678 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.010846 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lt4p5"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.011462 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.015335 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.015497 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.016124 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.016630 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.023585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.025859 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.026301 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.026743 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.033410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.037158 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.039491 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.040220 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041387 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041440 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsn9\" (UniqueName: \"kubernetes.io/projected/7a74161a-cf63-45fe-b306-aa335ccd309e-kube-api-access-dgsn9\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041577 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f58647de-0442-48c1-9647-53951b40db35-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041629 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041653 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-client\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041684 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57f5df54-714a-4cce-970a-7069ffd1cb63-serving-cert\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-config\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.041938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.042421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b31b4b93-c90d-498b-8136-247c40d9fee2-metrics-tls\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.042491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqr6s\" (UniqueName: \"kubernetes.io/projected/c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe-kube-api-access-xqr6s\") pod \"migrator-59844c95c7-ckcd4\" (UID: \"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.042852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-images\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-config\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043328 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-service-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043611 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be67fd32-9417-44ea-b20d-0af9897fea35-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.043660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-serving-cert\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.044009 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.047069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.047948 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.048318 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-images\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.048627 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4c0c858-0632-49da-b6d4-1b2e9f84f690-config\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.048644 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.049173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.049227 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.050456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.050648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a74161a-cf63-45fe-b306-aa335ccd309e-machine-approver-tls\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.050722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4c0c858-0632-49da-b6d4-1b2e9f84f690-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051161 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051189 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xknnv"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051202 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be67fd32-9417-44ea-b20d-0af9897fea35-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051317 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0c8715a-4133-49bd-b48f-12377582b8ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051598 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051617 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-tmpfs\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051634 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31b4b93-c90d-498b-8136-247c40d9fee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wmn\" (UniqueName: \"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-kube-api-access-r7wmn\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: 
\"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-client\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051841 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-auth-proxy-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-policies\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.051997 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2l2b\" (UniqueName: \"kubernetes.io/projected/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-kube-api-access-z2l2b\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:35 crc 
kubenswrapper[4792]: I0318 15:37:35.052020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58647de-0442-48c1-9647-53951b40db35-config\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052074 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r84w\" (UniqueName: 
\"kubernetes.io/projected/ac30b157-a927-4aa3-898d-917f8efd8338-kube-api-access-2r84w\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82x8h\" (UniqueName: \"kubernetes.io/projected/6a2d17c6-a211-4f41-89a0-a80ece425ed7-kube-api-access-82x8h\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc954\" (UniqueName: \"kubernetes.io/projected/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-kube-api-access-xc954\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052333 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052343 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-metrics-tls\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f58647de-0442-48c1-9647-53951b40db35-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052456 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e0c8715a-4133-49bd-b48f-12377582b8ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn67g\" (UniqueName: \"kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052520 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9c5m\" 
(UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-kube-api-access-z9c5m\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-encryption-config\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052572 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-dir\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hk6\" (UniqueName: \"kubernetes.io/projected/be2f97af-46a9-455f-bc78-9b25a736d1f5-kube-api-access-s5hk6\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052653 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052679 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88t6\" (UniqueName: \"kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8be2d62-1906-4029-91b4-7dd4d2197491-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052720 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf89w\" (UniqueName: 
\"kubernetes.io/projected/d4c0c858-0632-49da-b6d4-1b2e9f84f690-kube-api-access-sf89w\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.052768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v86c5\" (UniqueName: \"kubernetes.io/projected/be67fd32-9417-44ea-b20d-0af9897fea35-kube-api-access-v86c5\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.053092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.053407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-serving-cert\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.053523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.054609 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.054268 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrqhk"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.054791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.054689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nllkp\" (UniqueName: \"kubernetes.io/projected/e0c8715a-4133-49bd-b48f-12377582b8ce-kube-api-access-nllkp\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055233 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbtq\" (UniqueName: \"kubernetes.io/projected/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-kube-api-access-4gbtq\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac30b157-a927-4aa3-898d-917f8efd8338-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-config\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-service-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kntgf\" (UniqueName: \"kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be67fd32-9417-44ea-b20d-0af9897fea35-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8be2d62-1906-4029-91b4-7dd4d2197491-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc 
kubenswrapper[4792]: I0318 15:37:35.055521 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrs28\" (UniqueName: \"kubernetes.io/projected/57f5df54-714a-4cce-970a-7069ffd1cb63-kube-api-access-lrs28\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4c0c858-0632-49da-b6d4-1b2e9f84f690-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.055682 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.056772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.056792 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.056812 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.057170 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.057969 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.058379 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.058406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.058844 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.059443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.060102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a74161a-cf63-45fe-b306-aa335ccd309e-auth-proxy-config\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.060501 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.063844 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.065820 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.068804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be67fd32-9417-44ea-b20d-0af9897fea35-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.069254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-metrics-tls\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.069338 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert\") pod \"console-f9d7485db-gs8sw\" (UID: 
\"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.069580 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.069773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.069955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.070107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.070119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.070563 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.070732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8be2d62-1906-4029-91b4-7dd4d2197491-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.074245 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7a74161a-cf63-45fe-b306-aa335ccd309e-machine-approver-tls\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.074316 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dkkhx"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.076393 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564136-wt69c"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.077203 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.078183 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.079281 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.079793 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b6vrp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.080773 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.081193 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9j5pp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.083252 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ltvvk"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.084352 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svr4m"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.085306 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.086801 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bt6v4"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.087798 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4"] Mar 18 
15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.088822 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.089869 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.090903 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.091968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.092957 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.094019 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.095111 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.096389 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.097784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.097929 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-45c2x"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.098011 4792 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.098568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8be2d62-1906-4029-91b4-7dd4d2197491-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.099295 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.100862 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qw2mp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.106130 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.106342 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.106303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.107850 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lt4p5"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.109094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.113268 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.114641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qw2mp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.116121 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pk6tp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.117208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.117352 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.118645 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.119865 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.121932 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t"] Mar 
18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.123047 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b6vrp"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.130979 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.131283 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.133748 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrqhk"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.134861 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564136-wt69c"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.135928 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.136959 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.137215 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.138725 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.139840 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h4rq9"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.140875 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.141233 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h4rq9"] Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-config\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156147 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b31b4b93-c90d-498b-8136-247c40d9fee2-metrics-tls\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqr6s\" (UniqueName: \"kubernetes.io/projected/c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe-kube-api-access-xqr6s\") pod \"migrator-59844c95c7-ckcd4\" (UID: \"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156255 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-service-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-serving-cert\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156332 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156394 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0c8715a-4133-49bd-b48f-12377582b8ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156420 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-tmpfs\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31b4b93-c90d-498b-8136-247c40d9fee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wmn\" (UniqueName: \"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-kube-api-access-r7wmn\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156591 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-client\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-policies\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58647de-0442-48c1-9647-53951b40db35-config\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-config\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.156986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: 
\"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r84w\" (UniqueName: \"kubernetes.io/projected/ac30b157-a927-4aa3-898d-917f8efd8338-kube-api-access-2r84w\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157136 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82x8h\" (UniqueName: \"kubernetes.io/projected/6a2d17c6-a211-4f41-89a0-a80ece425ed7-kube-api-access-82x8h\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e0c8715a-4133-49bd-b48f-12377582b8ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f58647de-0442-48c1-9647-53951b40db35-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-encryption-config\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157406 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-dir\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hk6\" (UniqueName: \"kubernetes.io/projected/be2f97af-46a9-455f-bc78-9b25a736d1f5-kube-api-access-s5hk6\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-serving-cert\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157592 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nllkp\" (UniqueName: \"kubernetes.io/projected/e0c8715a-4133-49bd-b48f-12377582b8ce-kube-api-access-nllkp\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbtq\" (UniqueName: \"kubernetes.io/projected/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-kube-api-access-4gbtq\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157667 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac30b157-a927-4aa3-898d-917f8efd8338-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-service-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-config\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: 
\"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-service-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157765 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrs28\" (UniqueName: \"kubernetes.io/projected/57f5df54-714a-4cce-970a-7069ffd1cb63-kube-api-access-lrs28\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f58647de-0442-48c1-9647-53951b40db35-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157864 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-client\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/57f5df54-714a-4cce-970a-7069ffd1cb63-serving-cert\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157981 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-policies\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.158159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.158231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.157638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-tmpfs\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.158959 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.160385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-service-ca\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.160397 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0c8715a-4133-49bd-b48f-12377582b8ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.160667 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e0c8715a-4133-49bd-b48f-12377582b8ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.161244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5df54-714a-4cce-970a-7069ffd1cb63-config\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.161353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a2d17c6-a211-4f41-89a0-a80ece425ed7-audit-dir\") pod \"apiserver-7bbb656c7d-wjwpn\" 
(UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.161848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-etcd-client\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.162801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57f5df54-714a-4cce-970a-7069ffd1cb63-serving-cert\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.163438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be2f97af-46a9-455f-bc78-9b25a736d1f5-serving-cert\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.163717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-encryption-config\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.163803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-client\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: 
\"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.164265 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a2d17c6-a211-4f41-89a0-a80ece425ed7-serving-cert\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.164417 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac30b157-a927-4aa3-898d-917f8efd8338-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.167533 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.177802 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.187845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a2d17c6-a211-4f41-89a0-a80ece425ed7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.197154 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.217030 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.237702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.258905 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.278071 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.297616 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.318238 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.338853 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.358323 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.378112 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.397784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.417537 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.437910 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.458020 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.477931 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.484149 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f58647de-0442-48c1-9647-53951b40db35-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.499337 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.509035 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58647de-0442-48c1-9647-53951b40db35-config\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.517830 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.538280 4792 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.558810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.571272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b31b4b93-c90d-498b-8136-247c40d9fee2-metrics-tls\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.583984 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.588231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b31b4b93-c90d-498b-8136-247c40d9fee2-trusted-ca\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.597246 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.619141 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.639185 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.658682 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.677455 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.699020 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.717566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.738431 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.762057 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.778298 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.797837 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.818891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.838615 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:37:35 crc 
kubenswrapper[4792]: I0318 15:37:35.858790 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.877998 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.897041 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.918648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.957935 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.978997 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.996938 4792 request.go:700] Waited for 1.010215027s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Mar 18 15:37:35 crc kubenswrapper[4792]: I0318 15:37:35.999832 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.018797 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 
15:37:36.039335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.058524 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.078147 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.098579 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.117951 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.138636 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:37:36 crc kubenswrapper[4792]: E0318 15:37:36.157515 4792 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.157596 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: E0318 15:37:36.157613 4792 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 15:37:36 crc kubenswrapper[4792]: E0318 15:37:36.157661 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert 
podName:7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:36.657624349 +0000 UTC m=+205.526953326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert") pod "packageserver-d55dfcdfc-25vh8" (UID: "7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073") : failed to sync secret cache: timed out waiting for the condition Mar 18 15:37:36 crc kubenswrapper[4792]: E0318 15:37:36.157896 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert podName:7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073 nodeName:}" failed. No retries permitted until 2026-03-18 15:37:36.657733154 +0000 UTC m=+205.527062111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert") pod "packageserver-d55dfcdfc-25vh8" (UID: "7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073") : failed to sync secret cache: timed out waiting for the condition Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.177459 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.217957 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.237936 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.258754 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.279586 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.298686 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.318879 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.338765 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.358497 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.378455 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.397846 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.429446 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.438688 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.458491 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.478195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.498502 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.535588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsn9\" (UniqueName: \"kubernetes.io/projected/7a74161a-cf63-45fe-b306-aa335ccd309e-kube-api-access-dgsn9\") pod \"machine-approver-56656f9798-7xpzl\" (UID: \"7a74161a-cf63-45fe-b306-aa335ccd309e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.558035 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.564273 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2l2b\" (UniqueName: \"kubernetes.io/projected/dd12c69c-a021-4a8a-a5e4-3034aa92f62c-kube-api-access-z2l2b\") pod \"dns-operator-744455d44c-9j5pp\" (UID: \"dd12c69c-a021-4a8a-a5e4-3034aa92f62c\") " pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.593367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn67g\" (UniqueName: \"kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g\") pod \"controller-manager-879f6c89f-x2m75\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.616061 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf89w\" (UniqueName: \"kubernetes.io/projected/d4c0c858-0632-49da-b6d4-1b2e9f84f690-kube-api-access-sf89w\") pod \"machine-api-operator-5694c8668f-t7nc8\" (UID: \"d4c0c858-0632-49da-b6d4-1b2e9f84f690\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.643405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v86c5\" (UniqueName: \"kubernetes.io/projected/be67fd32-9417-44ea-b20d-0af9897fea35-kube-api-access-v86c5\") pod \"openshift-apiserver-operator-796bbdcf4f-fh49n\" (UID: \"be67fd32-9417-44ea-b20d-0af9897fea35\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.650851 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.658007 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.664610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9c5m\" (UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-kube-api-access-z9c5m\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.682516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.682623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.686041 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88t6\" (UniqueName: \"kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6\") pod \"oauth-openshift-558db77b4-svr4m\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.687737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-webhook-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.688178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-apiservice-cert\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.696059 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc954\" (UniqueName: \"kubernetes.io/projected/8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f-kube-api-access-xc954\") pod \"openshift-controller-manager-operator-756b6f6bc6-8df8j\" (UID: \"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 
15:37:36.710191 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.711721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8be2d62-1906-4029-91b4-7dd4d2197491-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cwwkb\" (UID: \"d8be2d62-1906-4029-91b4-7dd4d2197491\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.718252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.726773 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" Mar 18 15:37:36 crc kubenswrapper[4792]: W0318 15:37:36.731310 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a74161a_cf63_45fe_b306_aa335ccd309e.slice/crio-cd7f0c06124c45fe165bfe6a828dbcb56f9e5c5f493f5135326f6848d01807eb WatchSource:0}: Error finding container cd7f0c06124c45fe165bfe6a828dbcb56f9e5c5f493f5135326f6848d01807eb: Status 404 returned error can't find the container with id cd7f0c06124c45fe165bfe6a828dbcb56f9e5c5f493f5135326f6848d01807eb Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.737059 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.738877 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.763291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.776712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntgf\" (UniqueName: \"kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf\") pod \"console-f9d7485db-gs8sw\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.811648 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.813025 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.813405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.817700 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.824037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" event={"ID":"7a74161a-cf63-45fe-b306-aa335ccd309e","Type":"ContainerStarted","Data":"cd7f0c06124c45fe165bfe6a828dbcb56f9e5c5f493f5135326f6848d01807eb"} Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.830502 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.836954 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.839200 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.858335 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.877610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.897757 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.917888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.925044 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n"] Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.939011 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.958023 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 15:37:36 crc kubenswrapper[4792]: I0318 15:37:36.978407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:36.997894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 15:37:37 crc kubenswrapper[4792]: 
I0318 15:37:37.011604 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.011667 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.027185 4792 request.go:700] Waited for 1.927393972s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.033273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.034648 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc323f5d1_220a_41eb_a3db_e56dedfafc29.slice/crio-cbfde79dff05edb763a16a521be5bbbb875f0052c6608fc04d54fe4cd374fbfe WatchSource:0}: Error finding container cbfde79dff05edb763a16a521be5bbbb875f0052c6608fc04d54fe4cd374fbfe: Status 404 returned error can't find the container with id cbfde79dff05edb763a16a521be5bbbb875f0052c6608fc04d54fe4cd374fbfe Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.037732 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.037807 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8be2d62_1906_4029_91b4_7dd4d2197491.slice/crio-990e96de7cc60a759c4546ad67f1e78a61f9621d440ae2cb8db78b76e18be7ce WatchSource:0}: Error finding 
container 990e96de7cc60a759c4546ad67f1e78a61f9621d440ae2cb8db78b76e18be7ce: Status 404 returned error can't find the container with id 990e96de7cc60a759c4546ad67f1e78a61f9621d440ae2cb8db78b76e18be7ce Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.059685 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.081318 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.100503 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.118500 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.138105 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.189128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqr6s\" (UniqueName: \"kubernetes.io/projected/c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe-kube-api-access-xqr6s\") pod \"migrator-59844c95c7-ckcd4\" (UID: \"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.191470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wmn\" (UniqueName: \"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-kube-api-access-r7wmn\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.212182 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82x8h\" (UniqueName: \"kubernetes.io/projected/6a2d17c6-a211-4f41-89a0-a80ece425ed7-kube-api-access-82x8h\") pod \"apiserver-7bbb656c7d-wjwpn\" (UID: \"6a2d17c6-a211-4f41-89a0-a80ece425ed7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.230472 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b31b4b93-c90d-498b-8136-247c40d9fee2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gb4xw\" (UID: \"b31b4b93-c90d-498b-8136-247c40d9fee2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.251615 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f58647de-0442-48c1-9647-53951b40db35-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsbtg\" (UID: \"f58647de-0442-48c1-9647-53951b40db35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.254814 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.269906 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nc8"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.270886 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.272924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nllkp\" (UniqueName: \"kubernetes.io/projected/e0c8715a-4133-49bd-b48f-12377582b8ce-kube-api-access-nllkp\") pod \"openshift-config-operator-7777fb866f-d6vrl\" (UID: \"e0c8715a-4133-49bd-b48f-12377582b8ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.276816 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f3ba3c2_6c8e_46d5_8e01_1ef2c0a97b7f.slice/crio-b9666d25b0909a5198e9177d25d6fcb58da0deb46bb5ae6ea07807ad5a8765b5 WatchSource:0}: Error finding container b9666d25b0909a5198e9177d25d6fcb58da0deb46bb5ae6ea07807ad5a8765b5: Status 404 returned error can't find the container with id b9666d25b0909a5198e9177d25d6fcb58da0deb46bb5ae6ea07807ad5a8765b5 Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.295184 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.299668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hk6\" (UniqueName: \"kubernetes.io/projected/be2f97af-46a9-455f-bc78-9b25a736d1f5-kube-api-access-s5hk6\") pod \"etcd-operator-b45778765-xknnv\" (UID: \"be2f97af-46a9-455f-bc78-9b25a736d1f5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.300806 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.306467 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.315856 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r84w\" (UniqueName: \"kubernetes.io/projected/ac30b157-a927-4aa3-898d-917f8efd8338-kube-api-access-2r84w\") pod \"cluster-samples-operator-665b6dd947-85pml\" (UID: \"ac30b157-a927-4aa3-898d-917f8efd8338\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.318103 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svr4m"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.324889 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.328827 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9j5pp"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.340115 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrs28\" (UniqueName: \"kubernetes.io/projected/57f5df54-714a-4cce-970a-7069ffd1cb63-kube-api-access-lrs28\") pod \"authentication-operator-69f744f599-pk6tp\" (UID: \"57f5df54-714a-4cce-970a-7069ffd1cb63\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.355837 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbtq\" (UniqueName: \"kubernetes.io/projected/7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073-kube-api-access-4gbtq\") pod \"packageserver-d55dfcdfc-25vh8\" (UID: \"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.362128 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod089a8476_5fcb_4378_8113_4a7162685b16.slice/crio-fb5a76a2f3e408997aa0b5429402e8411660efb08b645365ee8366c65342a8c9 WatchSource:0}: Error finding container fb5a76a2f3e408997aa0b5429402e8411660efb08b645365ee8366c65342a8c9: Status 404 returned error can't find the container with id fb5a76a2f3e408997aa0b5429402e8411660efb08b645365ee8366c65342a8c9 Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.364981 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd12c69c_a021_4a8a_a5e4_3034aa92f62c.slice/crio-b8796c1eb573fe27ae703226cf35908e48984a0f87f648f457412f2b9f1c21a2 WatchSource:0}: Error finding container b8796c1eb573fe27ae703226cf35908e48984a0f87f648f457412f2b9f1c21a2: Status 404 returned error can't find the container with id b8796c1eb573fe27ae703226cf35908e48984a0f87f648f457412f2b9f1c21a2 Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.367836 4792 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc51bb642_174a_4bb3_8a20_1708d490a17d.slice/crio-468e84f2411aca2940d0735a19f6d50dad846138b633f4d4efcf82fa8a8725ca WatchSource:0}: Error finding container 468e84f2411aca2940d0735a19f6d50dad846138b633f4d4efcf82fa8a8725ca: Status 404 returned error can't find the container with id 468e84f2411aca2940d0735a19f6d50dad846138b633f4d4efcf82fa8a8725ca Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.430792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.430826 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-serving-cert\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.430852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.430870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmsx\" (UniqueName: 
\"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.430956 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9b9\" (UniqueName: \"kubernetes.io/projected/a0f79eb1-598c-4d72-af53-a928520fa0d9-kube-api-access-7d9b9\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431016 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-node-pullsecrets\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-serving-cert\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit-dir\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431244 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbx9\" (UniqueName: \"kubernetes.io/projected/01af365e-5f9a-4030-b54e-ebee4cf39552-kube-api-access-xgbx9\") pod \"downloads-7954f5f757-ltvvk\" (UID: \"01af365e-5f9a-4030-b54e-ebee4cf39552\") " pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.431287 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:37.931271488 +0000 UTC m=+206.800600535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431347 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431396 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-trusted-ca\") pod \"console-operator-58897d9998-bt6v4\" 
(UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-config\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f79eb1-598c-4d72-af53-a928520fa0d9-service-ca-bundle\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431625 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6813d755-7c94-40a7-a07c-95a2073a47fb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431643 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-proxy-tls\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431699 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-config\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431715 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-client\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpd69\" (UniqueName: \"kubernetes.io/projected/39d1af38-2c59-403e-a97e-2e22ac2737b3-kube-api-access-vpd69\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-config\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431908 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmdz\" (UniqueName: \"kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.431947 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432021 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432056 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432110 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106c799a-83d0-4815-ab5a-61c2b67b86f7-serving-cert\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: 
\"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgcb5\" (UniqueName: \"kubernetes.io/projected/106c799a-83d0-4815-ab5a-61c2b67b86f7-kube-api-access-pgcb5\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6813d755-7c94-40a7-a07c-95a2073a47fb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-image-import-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-images\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432356 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-stats-auth\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-encryption-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432425 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-default-certificate\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432448 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjzr6\" (UniqueName: \"kubernetes.io/projected/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-kube-api-access-gjzr6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xfj\" (UniqueName: \"kubernetes.io/projected/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-kube-api-access-s4xfj\") pod 
\"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdcn\" (UniqueName: \"kubernetes.io/projected/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-kube-api-access-wrdcn\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432560 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-metrics-certs\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 
15:37:37.432598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6813d755-7c94-40a7-a07c-95a2073a47fb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432636 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.432653 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.447813 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.464647 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.500729 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.507928 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533333 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtshf\" (UniqueName: \"kubernetes.io/projected/63a36577-fe9f-43d6-b1d9-5c918e1161b2-kube-api-access-wtshf\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x749\" (UniqueName: 
\"kubernetes.io/projected/52f8e1dd-171e-474f-b424-e879a4d73a5e-kube-api-access-2x749\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-default-certificate\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533804 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjzr6\" (UniqueName: \"kubernetes.io/projected/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-kube-api-access-gjzr6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsq9x\" (UniqueName: \"kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533861 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xfj\" (UniqueName: \"kubernetes.io/projected/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-kube-api-access-s4xfj\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533899 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533923 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-socket-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.533958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-cabundle\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8x4\" (UniqueName: 
\"kubernetes.io/projected/3175cda1-84cb-469d-9f49-028132d324ea-kube-api-access-ss8x4\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdcn\" (UniqueName: \"kubernetes.io/projected/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-kube-api-access-wrdcn\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-config-volume\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-metrics-certs\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6813d755-7c94-40a7-a07c-95a2073a47fb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-mountpoint-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc 
kubenswrapper[4792]: I0318 15:37:37.534283 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-serving-cert\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-serving-cert\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp6jb\" (UniqueName: 
\"kubernetes.io/projected/f75c6f2b-0965-4e4d-9445-dc65d69c970b-kube-api-access-zp6jb\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmsx\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9b9\" (UniqueName: \"kubernetes.io/projected/a0f79eb1-598c-4d72-af53-a928520fa0d9-kube-api-access-7d9b9\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-node-pullsecrets\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534513 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phk44\" (UniqueName: \"kubernetes.io/projected/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-kube-api-access-phk44\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534529 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-csi-data-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0f66845-303e-42cf-b091-be0ac57cba20-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534575 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbx9\" (UniqueName: \"kubernetes.io/projected/01af365e-5f9a-4030-b54e-ebee4cf39552-kube-api-access-xgbx9\") pod \"downloads-7954f5f757-ltvvk\" (UID: \"01af365e-5f9a-4030-b54e-ebee4cf39552\") " pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit-dir\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-trusted-ca\") pod 
\"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534713 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-config\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb00e07-7b51-4687-8cf9-6e285f133f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: \"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f79eb1-598c-4d72-af53-a928520fa0d9-service-ca-bundle\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkc7\" (UniqueName: \"kubernetes.io/projected/6a509c1f-d106-4f35-9226-58a6779b738b-kube-api-access-shkc7\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534813 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprw7\" (UniqueName: \"kubernetes.io/projected/d93d3069-e4d6-4291-9ef0-d07cba34401c-kube-api-access-qprw7\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534833 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87szr\" (UniqueName: \"kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.534908 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.034892688 +0000 UTC m=+206.904221625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.534956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535001 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-srv-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sct6n\" (UniqueName: \"kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n\") pod \"auto-csr-approver-29564136-wt69c\" (UID: \"e094f66b-fe57-429a-b5cd-de6084a8aacb\") " pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6813d755-7c94-40a7-a07c-95a2073a47fb-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535069 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-proxy-tls\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535085 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvxs\" (UniqueName: \"kubernetes.io/projected/bfb00e07-7b51-4687-8cf9-6e285f133f5b-kube-api-access-nvvxs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: \"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535184 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-config\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535224 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-client\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpd69\" (UniqueName: \"kubernetes.io/projected/39d1af38-2c59-403e-a97e-2e22ac2737b3-kube-api-access-vpd69\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535313 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63a36577-fe9f-43d6-b1d9-5c918e1161b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535364 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxb9\" (UniqueName: \"kubernetes.io/projected/7d216655-c83a-4f17-9e9a-367579911a35-kube-api-access-vbxb9\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: 
\"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-node-bootstrap-token\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-config\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmdz\" (UniqueName: \"kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jk8pg\" (UniqueName: \"kubernetes.io/projected/b0f66845-303e-42cf-b091-be0ac57cba20-kube-api-access-jk8pg\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535572 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535646 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zbg\" (UniqueName: \"kubernetes.io/projected/57f057e4-26f7-4d1f-92bb-18886619837c-kube-api-access-c9zbg\") pod \"ingress-canary-b6vrp\" (UID: \"57f057e4-26f7-4d1f-92bb-18886619837c\") " 
pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-srv-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-certs\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-key\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535790 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-metrics-tls\") pod 
\"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106c799a-83d0-4815-ab5a-61c2b67b86f7-serving-cert\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.535843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d216655-c83a-4f17-9e9a-367579911a35-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: \"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.536331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6813d755-7c94-40a7-a07c-95a2073a47fb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.536729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.538588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539166 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-node-pullsecrets\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a36577-fe9f-43d6-b1d9-5c918e1161b2-proxy-tls\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" 
Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-registration-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-plugins-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539698 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-image-import-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgcb5\" (UniqueName: \"kubernetes.io/projected/106c799a-83d0-4815-ab5a-61c2b67b86f7-kube-api-access-pgcb5\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539820 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6813d755-7c94-40a7-a07c-95a2073a47fb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: 
\"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-images\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.539929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f057e4-26f7-4d1f-92bb-18886619837c-cert\") pod \"ingress-canary-b6vrp\" (UID: \"57f057e4-26f7-4d1f-92bb-18886619837c\") " pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.540015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-stats-auth\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.540072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-encryption-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.540742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-config\") pod 
\"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.540876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.541282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit-dir\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.541461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.541614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.542633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f79eb1-598c-4d72-af53-a928520fa0d9-service-ca-bundle\") pod 
\"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.542797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-config\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.543493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-audit\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.543622 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.544315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.546205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.547426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/39d1af38-2c59-403e-a97e-2e22ac2737b3-image-import-ca\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.551958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-images\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.552076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-config\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.552138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.552594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.552969 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.553251 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/106c799a-83d0-4815-ab5a-61c2b67b86f7-trusted-ca\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.553321 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.553888 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106c799a-83d0-4815-ab5a-61c2b67b86f7-serving-cert\") pod \"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.554262 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-serving-cert\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.554547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.555133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-serving-cert\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.556099 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-etcd-client\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.557961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-metrics-certs\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc 
kubenswrapper[4792]: I0318 15:37:37.558329 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.558579 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-proxy-tls\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.553890 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6813d755-7c94-40a7-a07c-95a2073a47fb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.565810 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.566448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-default-certificate\") pod \"router-default-5444994796-tfth7\" (UID: 
\"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.566969 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.568786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0f79eb1-598c-4d72-af53-a928520fa0d9-stats-auth\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.583887 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/39d1af38-2c59-403e-a97e-2e22ac2737b3-encryption-config\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.592726 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xfj\" (UniqueName: \"kubernetes.io/projected/b9a5245d-b1d7-4826-8f4b-7dadfbae4263-kube-api-access-s4xfj\") pod \"machine-config-operator-74547568cd-h6rvj\" (UID: \"b9a5245d-b1d7-4826-8f4b-7dadfbae4263\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.595604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmdz\" (UniqueName: \"kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz\") pod \"route-controller-manager-6576b87f9c-9j26r\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc 
kubenswrapper[4792]: W0318 15:37:37.605398 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58647de_0442_48c1_9647_53951b40db35.slice/crio-bc0b5f5ca255ac7dd678eff83f7693f35aa543d6ec9b4029b64755a18e96c499 WatchSource:0}: Error finding container bc0b5f5ca255ac7dd678eff83f7693f35aa543d6ec9b4029b64755a18e96c499: Status 404 returned error can't find the container with id bc0b5f5ca255ac7dd678eff83f7693f35aa543d6ec9b4029b64755a18e96c499 Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.637181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpd69\" (UniqueName: \"kubernetes.io/projected/39d1af38-2c59-403e-a97e-2e22ac2737b3-kube-api-access-vpd69\") pod \"apiserver-76f77b778f-dkkhx\" (UID: \"39d1af38-2c59-403e-a97e-2e22ac2737b3\") " pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.637686 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-srv-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sct6n\" (UniqueName: \"kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n\") pod \"auto-csr-approver-29564136-wt69c\" (UID: \"e094f66b-fe57-429a-b5cd-de6084a8aacb\") " pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641117 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvxs\" (UniqueName: \"kubernetes.io/projected/bfb00e07-7b51-4687-8cf9-6e285f133f5b-kube-api-access-nvvxs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: \"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/63a36577-fe9f-43d6-b1d9-5c918e1161b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxb9\" (UniqueName: \"kubernetes.io/projected/7d216655-c83a-4f17-9e9a-367579911a35-kube-api-access-vbxb9\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: \"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641212 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-node-bootstrap-token\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk8pg\" (UniqueName: \"kubernetes.io/projected/b0f66845-303e-42cf-b091-be0ac57cba20-kube-api-access-jk8pg\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641251 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641273 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zbg\" (UniqueName: \"kubernetes.io/projected/57f057e4-26f7-4d1f-92bb-18886619837c-kube-api-access-c9zbg\") pod \"ingress-canary-b6vrp\" (UID: \"57f057e4-26f7-4d1f-92bb-18886619837c\") " pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641294 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-srv-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-certs\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641332 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-key\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641352 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-metrics-tls\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " 
pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d216655-c83a-4f17-9e9a-367579911a35-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: \"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a36577-fe9f-43d6-b1d9-5c918e1161b2-proxy-tls\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-registration-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-plugins-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641457 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f057e4-26f7-4d1f-92bb-18886619837c-cert\") pod \"ingress-canary-b6vrp\" (UID: 
\"57f057e4-26f7-4d1f-92bb-18886619837c\") " pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641487 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtshf\" (UniqueName: \"kubernetes.io/projected/63a36577-fe9f-43d6-b1d9-5c918e1161b2-kube-api-access-wtshf\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x749\" (UniqueName: \"kubernetes.io/projected/52f8e1dd-171e-474f-b424-e879a4d73a5e-kube-api-access-2x749\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsq9x\" (UniqueName: \"kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-socket-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-cabundle\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641604 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss8x4\" (UniqueName: \"kubernetes.io/projected/3175cda1-84cb-469d-9f49-028132d324ea-kube-api-access-ss8x4\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641628 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-config-volume\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-mountpoint-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641668 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641704 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp6jb\" (UniqueName: \"kubernetes.io/projected/f75c6f2b-0965-4e4d-9445-dc65d69c970b-kube-api-access-zp6jb\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phk44\" (UniqueName: \"kubernetes.io/projected/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-kube-api-access-phk44\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-csi-data-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641802 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0f66845-303e-42cf-b091-be0ac57cba20-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb00e07-7b51-4687-8cf9-6e285f133f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: \"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkc7\" (UniqueName: \"kubernetes.io/projected/6a509c1f-d106-4f35-9226-58a6779b738b-kube-api-access-shkc7\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprw7\" (UniqueName: \"kubernetes.io/projected/d93d3069-e4d6-4291-9ef0-d07cba34401c-kube-api-access-qprw7\") pod 
\"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87szr\" (UniqueName: \"kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.641944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.642418 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.142398621 +0000 UTC m=+207.011727558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.644160 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-registration-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.645224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-config-volume\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.645325 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-socket-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.645359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 
15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.646355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.646437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-mountpoint-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.646481 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-cabundle\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.647042 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-plugins-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.647420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3175cda1-84cb-469d-9f49-028132d324ea-csi-data-dir\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 
15:37:37.650685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.651344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.651450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-srv-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.651522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0f66845-303e-42cf-b091-be0ac57cba20-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.651832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb00e07-7b51-4687-8cf9-6e285f133f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: 
\"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.652756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63a36577-fe9f-43d6-b1d9-5c918e1161b2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.653485 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a509c1f-d106-4f35-9226-58a6779b738b-srv-cert\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.653609 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f057e4-26f7-4d1f-92bb-18886619837c-cert\") pod \"ingress-canary-b6vrp\" (UID: \"57f057e4-26f7-4d1f-92bb-18886619837c\") " pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.654376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f75c6f2b-0965-4e4d-9445-dc65d69c970b-profile-collector-cert\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.655147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a36577-fe9f-43d6-b1d9-5c918e1161b2-proxy-tls\") pod 
\"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.655399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.656058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d216655-c83a-4f17-9e9a-367579911a35-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: \"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.657199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.658071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52f8e1dd-171e-474f-b424-e879a4d73a5e-signing-key\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.659325 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-node-bootstrap-token\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " 
pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.660332 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.663103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d93d3069-e4d6-4291-9ef0-d07cba34401c-certs\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.667310 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-metrics-tls\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.685698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbx9\" (UniqueName: \"kubernetes.io/projected/01af365e-5f9a-4030-b54e-ebee4cf39552-kube-api-access-xgbx9\") pod \"downloads-7954f5f757-ltvvk\" (UID: \"01af365e-5f9a-4030-b54e-ebee4cf39552\") " pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.692319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6813d755-7c94-40a7-a07c-95a2073a47fb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mrbj7\" (UID: \"6813d755-7c94-40a7-a07c-95a2073a47fb\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.720564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qbhrj\" (UID: \"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.741522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjzr6\" (UniqueName: \"kubernetes.io/projected/0ca9e0b0-a082-4128-a24a-846d0fc5e7a5-kube-api-access-gjzr6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ql7f8\" (UID: \"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.742346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.742608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.742648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.742677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.743246 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.243229069 +0000 UTC m=+207.112558016 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.743906 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.751648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.751888 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.757617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgcb5\" (UniqueName: \"kubernetes.io/projected/106c799a-83d0-4815-ab5a-61c2b67b86f7-kube-api-access-pgcb5\") pod 
\"console-operator-58897d9998-bt6v4\" (UID: \"106c799a-83d0-4815-ab5a-61c2b67b86f7\") " pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.787517 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.787807 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdcn\" (UniqueName: \"kubernetes.io/projected/f43e9d58-ee06-48d9-bed5-6b2e039d0a3a-kube-api-access-wrdcn\") pod \"service-ca-operator-777779d784-cjxlb\" (UID: \"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.792534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.799460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmsx\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.818535 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.824651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9b9\" (UniqueName: \"kubernetes.io/projected/a0f79eb1-598c-4d72-af53-a928520fa0d9-kube-api-access-7d9b9\") pod \"router-default-5444994796-tfth7\" (UID: \"a0f79eb1-598c-4d72-af53-a928520fa0d9\") " pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.833234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" event={"ID":"d4c0c858-0632-49da-b6d4-1b2e9f84f690","Type":"ContainerStarted","Data":"2b1840adbbf20c3f368c45672ed79f6848d91749ef974239039e493e6802845a"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.833272 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" event={"ID":"d4c0c858-0632-49da-b6d4-1b2e9f84f690","Type":"ContainerStarted","Data":"bf7abd9d576a2f3a2e021bf0c9506bc6336cd404214797497580810606bb864d"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.833282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" event={"ID":"d4c0c858-0632-49da-b6d4-1b2e9f84f690","Type":"ContainerStarted","Data":"21f4d6904e35083d208f169cf9d0d5cda227945a6aa425b4b99cdbe442ec894a"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.835353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" event={"ID":"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f","Type":"ContainerStarted","Data":"0d5463d5c5ffdfd35adaf101bd0dc1a6bb71f4f7b9edffc44f37b4825088083b"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.835378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" event={"ID":"8f3ba3c2-6c8e-46d5-8e01-1ef2c0a97b7f","Type":"ContainerStarted","Data":"b9666d25b0909a5198e9177d25d6fcb58da0deb46bb5ae6ea07807ad5a8765b5"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.836125 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.840628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gs8sw" event={"ID":"089a8476-5fcb-4378-8113-4a7162685b16","Type":"ContainerStarted","Data":"2ab6db78b289f688262fd75b87e727ca0bf2093cdb8a5376f333838d668cf575"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.840668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gs8sw" event={"ID":"089a8476-5fcb-4378-8113-4a7162685b16","Type":"ContainerStarted","Data":"fb5a76a2f3e408997aa0b5429402e8411660efb08b645365ee8366c65342a8c9"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.841174 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.843602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.843756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.844054 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.344042776 +0000 UTC m=+207.213371713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.849134 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.849270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.885945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtshf\" (UniqueName: \"kubernetes.io/projected/63a36577-fe9f-43d6-b1d9-5c918e1161b2-kube-api-access-wtshf\") pod \"machine-config-controller-84d6567774-vtf8t\" (UID: \"63a36577-fe9f-43d6-b1d9-5c918e1161b2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.886216 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.904607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkc7\" (UniqueName: \"kubernetes.io/projected/6a509c1f-d106-4f35-9226-58a6779b738b-kube-api-access-shkc7\") pod \"olm-operator-6b444d44fb-mm4h4\" (UID: \"6a509c1f-d106-4f35-9226-58a6779b738b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.911826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sct6n\" (UniqueName: \"kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n\") pod \"auto-csr-approver-29564136-wt69c\" (UID: \"e094f66b-fe57-429a-b5cd-de6084a8aacb\") " pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.914189 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.914971 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" event={"ID":"d8be2d62-1906-4029-91b4-7dd4d2197491","Type":"ContainerStarted","Data":"c791c085425f9bccd1495a16ace2ce7fd76bb1aaa665d2209c11a34e872f2688"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.915022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" event={"ID":"d8be2d62-1906-4029-91b4-7dd4d2197491","Type":"ContainerStarted","Data":"990e96de7cc60a759c4546ad67f1e78a61f9621d440ae2cb8db78b76e18be7ce"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.917865 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x749\" (UniqueName: \"kubernetes.io/projected/52f8e1dd-171e-474f-b424-e879a4d73a5e-kube-api-access-2x749\") pod \"service-ca-9c57cc56f-lt4p5\" (UID: \"52f8e1dd-171e-474f-b424-e879a4d73a5e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.922358 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.926132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.930158 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.931357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" event={"ID":"be67fd32-9417-44ea-b20d-0af9897fea35","Type":"ContainerStarted","Data":"d542e8e11ddd5d3b1a204e21368aeeb8c826acef396b6ea99300b41bc758f42b"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.931403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" event={"ID":"be67fd32-9417-44ea-b20d-0af9897fea35","Type":"ContainerStarted","Data":"9f904a0a3148dc8aff6ef7bae9e7b09ffa68f65d5c752a3b5ad89c598104ee4f"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.935491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" event={"ID":"6a2d17c6-a211-4f41-89a0-a80ece425ed7","Type":"ContainerStarted","Data":"dbda692c6666d8a67d426b6ad244ddacdea35846b0c0a1246fa870e53ee3f6ff"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.935774 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.938632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsq9x\" (UniqueName: \"kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x\") pod \"marketplace-operator-79b997595-l8284\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.947657 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.947778 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.948065 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.448044439 +0000 UTC m=+207.317373376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.948288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:37 crc kubenswrapper[4792]: E0318 15:37:37.949262 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.449248384 +0000 UTC m=+207.318577321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.949404 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" event={"ID":"dd12c69c-a021-4a8a-a5e4-3034aa92f62c","Type":"ContainerStarted","Data":"7cb5585a1423779d5a6bc2776fc424ef46e55b86b2e8d095d83814ee885f00af"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.949435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" event={"ID":"dd12c69c-a021-4a8a-a5e4-3034aa92f62c","Type":"ContainerStarted","Data":"b8796c1eb573fe27ae703226cf35908e48984a0f87f648f457412f2b9f1c21a2"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.954444 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phk44\" (UniqueName: \"kubernetes.io/projected/51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5-kube-api-access-phk44\") pod \"dns-default-h4rq9\" (UID: \"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5\") " pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.960207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" 
event={"ID":"c51bb642-174a-4bb3-8a20-1708d490a17d","Type":"ContainerStarted","Data":"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.960266 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xknnv"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.960286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" event={"ID":"c51bb642-174a-4bb3-8a20-1708d490a17d","Type":"ContainerStarted","Data":"468e84f2411aca2940d0735a19f6d50dad846138b633f4d4efcf82fa8a8725ca"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.961118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.964404 4792 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-svr4m container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.964444 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.965834 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl"] Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.966102 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.968661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" event={"ID":"f58647de-0442-48c1-9647-53951b40db35","Type":"ContainerStarted","Data":"bc0b5f5ca255ac7dd678eff83f7693f35aa543d6ec9b4029b64755a18e96c499"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.973141 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.973832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp6jb\" (UniqueName: \"kubernetes.io/projected/f75c6f2b-0965-4e4d-9445-dc65d69c970b-kube-api-access-zp6jb\") pod \"catalog-operator-68c6474976-xmfwt\" (UID: \"f75c6f2b-0965-4e4d-9445-dc65d69c970b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.974605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" event={"ID":"7a74161a-cf63-45fe-b306-aa335ccd309e","Type":"ContainerStarted","Data":"5feadce2e4aa32dd3120da6fd9f413c5fc4a186c2b4de29fda6cce59d63598c6"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.974641 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" event={"ID":"7a74161a-cf63-45fe-b306-aa335ccd309e","Type":"ContainerStarted","Data":"072f8ed3585c47acda04b44eacd6ec1a9c89ad44048c9c37e7b5a38dcf619cb4"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.977487 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" 
event={"ID":"c323f5d1-220a-41eb-a3db-e56dedfafc29","Type":"ContainerStarted","Data":"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.977649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" event={"ID":"c323f5d1-220a-41eb-a3db-e56dedfafc29","Type":"ContainerStarted","Data":"cbfde79dff05edb763a16a521be5bbbb875f0052c6608fc04d54fe4cd374fbfe"} Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.978525 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.980116 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:37 crc kubenswrapper[4792]: W0318 15:37:37.980328 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ff6f1e_85ce_4aa9_b3ac_73b0d78abcbe.slice/crio-e15acf68281161b4dd4a5572b6334c49d9926cb32464090ad7fef266bed8d5d5 WatchSource:0}: Error finding container e15acf68281161b4dd4a5572b6334c49d9926cb32464090ad7fef266bed8d5d5: Status 404 returned error can't find the container with id e15acf68281161b4dd4a5572b6334c49d9926cb32464090ad7fef266bed8d5d5 Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.982215 4792 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x2m75 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.982267 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" 
podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 15:37:37 crc kubenswrapper[4792]: I0318 15:37:37.989323 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.005747 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.006012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprw7\" (UniqueName: \"kubernetes.io/projected/d93d3069-e4d6-4291-9ef0-d07cba34401c-kube-api-access-qprw7\") pod \"machine-config-server-45c2x\" (UID: \"d93d3069-e4d6-4291-9ef0-d07cba34401c\") " pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.015826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87szr\" (UniqueName: \"kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr\") pod \"collect-profiles-29564130-zxv4v\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.021060 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.026970 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.035085 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss8x4\" (UniqueName: \"kubernetes.io/projected/3175cda1-84cb-469d-9f49-028132d324ea-kube-api-access-ss8x4\") pod \"csi-hostpathplugin-qw2mp\" (UID: \"3175cda1-84cb-469d-9f49-028132d324ea\") " pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.041288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:37:38 crc kubenswrapper[4792]: W0318 15:37:38.045905 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe2f97af_46a9_455f_bc78_9b25a736d1f5.slice/crio-f7a54d7ae01c6b88c3269d4216662471f64202ed28961a258463325dd82b1b28 WatchSource:0}: Error finding container f7a54d7ae01c6b88c3269d4216662471f64202ed28961a258463325dd82b1b28: Status 404 returned error can't find the container with id f7a54d7ae01c6b88c3269d4216662471f64202ed28961a258463325dd82b1b28 Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.049961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.050126 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.550099992 +0000 UTC m=+207.419428929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.054823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.057149 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.55713486 +0000 UTC m=+207.426463797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.058320 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zbg\" (UniqueName: \"kubernetes.io/projected/57f057e4-26f7-4d1f-92bb-18886619837c-kube-api-access-c9zbg\") pod \"ingress-canary-b6vrp\" (UID: \"57f057e4-26f7-4d1f-92bb-18886619837c\") " pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.063833 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-45c2x" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.073393 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.099415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.101382 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.108272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvxs\" (UniqueName: \"kubernetes.io/projected/bfb00e07-7b51-4687-8cf9-6e285f133f5b-kube-api-access-nvvxs\") pod \"multus-admission-controller-857f4d67dd-jrqhk\" (UID: \"bfb00e07-7b51-4687-8cf9-6e285f133f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.119186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxb9\" (UniqueName: \"kubernetes.io/projected/7d216655-c83a-4f17-9e9a-367579911a35-kube-api-access-vbxb9\") pod \"package-server-manager-789f6589d5-2jfcp\" (UID: \"7d216655-c83a-4f17-9e9a-367579911a35\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.128205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk8pg\" (UniqueName: \"kubernetes.io/projected/b0f66845-303e-42cf-b091-be0ac57cba20-kube-api-access-jk8pg\") pod \"control-plane-machine-set-operator-78cbb6b69f-zcnn4\" (UID: \"b0f66845-303e-42cf-b091-be0ac57cba20\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:38 crc kubenswrapper[4792]: W0318 15:37:38.157561 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f79eb1_598c_4d72_af53_a928520fa0d9.slice/crio-ba4292110bbcecdac64e16c295bc9d4c5f38c36d54217bfccee480aa5b5d3a21 WatchSource:0}: Error finding container ba4292110bbcecdac64e16c295bc9d4c5f38c36d54217bfccee480aa5b5d3a21: Status 404 returned error can't find the container with id ba4292110bbcecdac64e16c295bc9d4c5f38c36d54217bfccee480aa5b5d3a21 Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.176009 
4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.176303 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.67628972 +0000 UTC m=+207.545618657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.277719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.278320 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.778308741 +0000 UTC m=+207.647637678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.294738 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.303639 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.307961 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.311723 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.320540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pk6tp"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.347947 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.355440 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b6vrp" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.379683 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.380095 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.880040302 +0000 UTC m=+207.749369239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.380255 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.380761 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.880753118 +0000 UTC m=+207.750082056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.487034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.487161 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.98713914 +0000 UTC m=+207.856468077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.487450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.487802 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:38.987788214 +0000 UTC m=+207.857117141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.589308 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.589623 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.089603517 +0000 UTC m=+207.958932454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.590090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.590521 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.090507611 +0000 UTC m=+207.959836548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.628027 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.629277 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dkkhx"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.694741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.695444 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.195422838 +0000 UTC m=+208.064751775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.743808 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48188: no serving certificate available for the kubelet" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.786387 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj"] Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.801799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.802162 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.302144852 +0000 UTC m=+208.171473789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.847997 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48200: no serving certificate available for the kubelet" Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.904242 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:38 crc kubenswrapper[4792]: E0318 15:37:38.904595 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.404579719 +0000 UTC m=+208.273908656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:38 crc kubenswrapper[4792]: I0318 15:37:38.979683 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48212: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.005396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.005669 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.505657326 +0000 UTC m=+208.374986263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.034288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" event={"ID":"e0c8715a-4133-49bd-b48f-12377582b8ce","Type":"ContainerStarted","Data":"0b592d76ba3a5903e90b7d128911f844fec0fcda2ea6adea259b425f459af69a"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.034349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" event={"ID":"e0c8715a-4133-49bd-b48f-12377582b8ce","Type":"ContainerStarted","Data":"d8609b82542506d582dd3318088556790bf0eac1c7230f524906343149e0f20b"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.043030 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" event={"ID":"b9a5245d-b1d7-4826-8f4b-7dadfbae4263","Type":"ContainerStarted","Data":"95aff243c1e07f1b2114864c0d4d70cc63a9b7098f487652d8e5dfe7ec1e5ea3"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.048175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-45c2x" event={"ID":"d93d3069-e4d6-4291-9ef0-d07cba34401c","Type":"ContainerStarted","Data":"2081a87acfc99e3e429885fcf8ea0a0904797eac64d65cf823b2d7210219159c"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.054653 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" event={"ID":"39d1af38-2c59-403e-a97e-2e22ac2737b3","Type":"ContainerStarted","Data":"946314b149f8660d5e3a916e0f68bff8dd9f645491cda89b39b10f9e0f28b2dd"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.063494 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48218: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.108712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.109016 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.609001966 +0000 UTC m=+208.478330903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.109289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5bb56746b1a556af9bc6623a9c5ac478c1b0c86ca81e297f2dba9b96323e48cb"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.125115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tfth7" event={"ID":"a0f79eb1-598c-4d72-af53-a928520fa0d9","Type":"ContainerStarted","Data":"ba4292110bbcecdac64e16c295bc9d4c5f38c36d54217bfccee480aa5b5d3a21"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.159194 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" event={"ID":"ac30b157-a927-4aa3-898d-917f8efd8338","Type":"ContainerStarted","Data":"ee0bd6d31fbeae687c5995c0845f6e28cf37666a4eab3ff8abcf4b7dd116162c"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.166627 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48224: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.191469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" event={"ID":"be2f97af-46a9-455f-bc78-9b25a736d1f5","Type":"ContainerStarted","Data":"f7a54d7ae01c6b88c3269d4216662471f64202ed28961a258463325dd82b1b28"} Mar 18 15:37:39 
crc kubenswrapper[4792]: I0318 15:37:39.211721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.212389 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.712372487 +0000 UTC m=+208.581701414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.220061 4792 generic.go:334] "Generic (PLEG): container finished" podID="6a2d17c6-a211-4f41-89a0-a80ece425ed7" containerID="16e2e96364ad74d5ff462d7558221132eb14cfcade3104979445c0548fafb908" exitCode=0 Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.220148 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" event={"ID":"6a2d17c6-a211-4f41-89a0-a80ece425ed7","Type":"ContainerDied","Data":"16e2e96364ad74d5ff462d7558221132eb14cfcade3104979445c0548fafb908"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.257271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" event={"ID":"dd12c69c-a021-4a8a-a5e4-3034aa92f62c","Type":"ContainerStarted","Data":"9e8ee86e3aa42cd7dbaf7d030ba7d2889e80f84d5317c9b35c301ce8a9ee2341"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.259884 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48236: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.260485 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" event={"ID":"6813d755-7c94-40a7-a07c-95a2073a47fb","Type":"ContainerStarted","Data":"1ef1db041729fc7328434fca291e442dab23766cae30b034ddfb20d444e1cb13"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.268465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" event={"ID":"57f5df54-714a-4cce-970a-7069ffd1cb63","Type":"ContainerStarted","Data":"dd97ffed87f9548499af8e338a41e6da62dbf455e05dff806dae448d277a827d"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.269560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" event={"ID":"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073","Type":"ContainerStarted","Data":"c212c1281d1822507b5a3159451e30024c77c25d95e95a18d7e431d85e807d89"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.272491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" event={"ID":"b31b4b93-c90d-498b-8136-247c40d9fee2","Type":"ContainerStarted","Data":"a8e886de038d1bb54e02fcdfeea0ec9c855b1b608002ac5e44cb82dfd8683581"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.272555 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" 
event={"ID":"b31b4b93-c90d-498b-8136-247c40d9fee2","Type":"ContainerStarted","Data":"5b234e81bdd3d96b71406f02f4c1a616762dc1ba6f0378a66d350a2c94a3e48a"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.277019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6c2a44c9430bce4a9eea43beae3691046e33c8e580fef23d4d475708508a2a87"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.281332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" event={"ID":"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe","Type":"ContainerStarted","Data":"594d0dbf3493c0f738e5919f87dfdc18963f140eeff857a99343f85a8d1dea09"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.281390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" event={"ID":"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe","Type":"ContainerStarted","Data":"e15acf68281161b4dd4a5572b6334c49d9926cb32464090ad7fef266bed8d5d5"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.284459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" event={"ID":"f58647de-0442-48c1-9647-53951b40db35","Type":"ContainerStarted","Data":"bdbc8182a7e3a2b5951523166e68b6566e89ffe3a0cbe0f3fcd65988e90ee1eb"} Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.311438 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.312838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.313842 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:39.813823547 +0000 UTC m=+208.683152484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.370539 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48246: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.414893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.423870 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:39.9199642 +0000 UTC m=+208.789293137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.468782 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48262: no serving certificate available for the kubelet" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.516488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.516762 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.016747749 +0000 UTC m=+208.886076686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.576959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7xpzl" podStartSLOduration=160.576940057 podStartE2EDuration="2m40.576940057s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.574565729 +0000 UTC m=+208.443894656" watchObservedRunningTime="2026-03-18 15:37:39.576940057 +0000 UTC m=+208.446269014" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.621813 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.622202 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.122187715 +0000 UTC m=+208.991516652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.627271 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bt6v4"] Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.640270 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ltvvk"] Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.656976 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gs8sw" podStartSLOduration=160.656953921 podStartE2EDuration="2m40.656953921s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.65340213 +0000 UTC m=+208.522731147" watchObservedRunningTime="2026-03-18 15:37:39.656953921 +0000 UTC m=+208.526282858" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.681646 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsbtg" podStartSLOduration=160.681626925 podStartE2EDuration="2m40.681626925s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.67658149 +0000 UTC m=+208.545910437" watchObservedRunningTime="2026-03-18 15:37:39.681626925 +0000 UTC m=+208.550955862" Mar 
18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.725096 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.725668 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.225652799 +0000 UTC m=+209.094981736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.749933 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fh49n" podStartSLOduration=160.747028114 podStartE2EDuration="2m40.747028114s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.743868978 +0000 UTC m=+208.613197915" watchObservedRunningTime="2026-03-18 15:37:39.747028114 +0000 UTC m=+208.616357051" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.765541 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.785001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb"] Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.799590 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" podStartSLOduration=160.799569281 podStartE2EDuration="2m40.799569281s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.797499185 +0000 UTC m=+208.666828122" watchObservedRunningTime="2026-03-18 15:37:39.799569281 +0000 UTC m=+208.668898228" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.827167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.827459 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.327447113 +0000 UTC m=+209.196776050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.852948 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8df8j" podStartSLOduration=160.852929398 podStartE2EDuration="2m40.852929398s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.818585248 +0000 UTC m=+208.687914185" watchObservedRunningTime="2026-03-18 15:37:39.852929398 +0000 UTC m=+208.722258335" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.854686 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cwwkb" podStartSLOduration=160.854678941 podStartE2EDuration="2m40.854678941s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.852436649 +0000 UTC m=+208.721765606" watchObservedRunningTime="2026-03-18 15:37:39.854678941 +0000 UTC m=+208.724007878" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.890240 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.929586 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:39 crc kubenswrapper[4792]: E0318 15:37:39.929907 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.429890389 +0000 UTC m=+209.299219326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.953722 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" podStartSLOduration=160.953706043 podStartE2EDuration="2m40.953706043s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.951074857 +0000 UTC m=+208.820403794" watchObservedRunningTime="2026-03-18 15:37:39.953706043 +0000 UTC m=+208.823034980" Mar 18 15:37:39 crc kubenswrapper[4792]: I0318 15:37:39.993318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nc8" podStartSLOduration=160.993294945 
podStartE2EDuration="2m40.993294945s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:39.988853432 +0000 UTC m=+208.858182389" watchObservedRunningTime="2026-03-18 15:37:39.993294945 +0000 UTC m=+208.862623882" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.030463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.030868 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.530846802 +0000 UTC m=+209.400175739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.039924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lt4p5"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.077076 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9j5pp" podStartSLOduration=161.077049857 podStartE2EDuration="2m41.077049857s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.062401389 +0000 UTC m=+208.931730326" watchObservedRunningTime="2026-03-18 15:37:40.077049857 +0000 UTC m=+208.946378794" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.087276 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.136541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.136938 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.636906791 +0000 UTC m=+209.506235728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.137094 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.137580 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.637569796 +0000 UTC m=+209.506898733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.161048 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.164103 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.182771 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48272: no serving certificate available for the kubelet" Mar 18 15:37:40 crc kubenswrapper[4792]: W0318 15:37:40.237341 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63a36577_fe9f_43d6_b1d9_5c918e1161b2.slice/crio-64a216fe23fc781d74ea6d3bd23e3cfb0e99f68dd6d5e2c0856dca5d45ea3497 WatchSource:0}: Error finding container 64a216fe23fc781d74ea6d3bd23e3cfb0e99f68dd6d5e2c0856dca5d45ea3497: Status 404 returned error can't find the container with id 64a216fe23fc781d74ea6d3bd23e3cfb0e99f68dd6d5e2c0856dca5d45ea3497 Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.238507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.238784 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.738766767 +0000 UTC m=+209.608095704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.245299 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h4rq9"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.339607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.340047 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.840030511 +0000 UTC m=+209.709359498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.383644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" event={"ID":"b31b4b93-c90d-498b-8136-247c40d9fee2","Type":"ContainerStarted","Data":"a32839dc567052dec4cc0330e6b17dad203496570130fcd207cf9deeb01d0406"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.417187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" event={"ID":"106c799a-83d0-4815-ab5a-61c2b67b86f7","Type":"ContainerStarted","Data":"407128762f2dcdb24436dfdd570fc686b98d95b5b4eea75f6bdf4c0da85f7fd1"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.430823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" event={"ID":"52f8e1dd-171e-474f-b424-e879a4d73a5e","Type":"ContainerStarted","Data":"68f045d81776c8c2d646478466bf8a5ce0005f84b5ee6163be4303e089f906d4"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.437421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" event={"ID":"f75c6f2b-0965-4e4d-9445-dc65d69c970b","Type":"ContainerStarted","Data":"c5b50d2a590a8960ee912edb8a0d52f6715844f288b266cc28c2977deffd741a"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.440495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.440678 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.940656121 +0000 UTC m=+209.809985058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.441174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.441532 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:40.941511863 +0000 UTC m=+209.810840800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.464496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" event={"ID":"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a","Type":"ContainerStarted","Data":"1227926d3bcebb5ed881c132375ae2d45d16a5b5e6ef41a46804178dfe7dd99d"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.477114 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gb4xw" podStartSLOduration=161.477092227 podStartE2EDuration="2m41.477092227s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.465119318 +0000 UTC m=+209.334448275" watchObservedRunningTime="2026-03-18 15:37:40.477092227 +0000 UTC m=+209.346421164" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.499276 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.503598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-45c2x" event={"ID":"d93d3069-e4d6-4291-9ef0-d07cba34401c","Type":"ContainerStarted","Data":"083b2d03562f0a3b0e645e7f073159610f2b2948f0b318558d2200ac4f84fbee"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.526321 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.528332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" event={"ID":"57f5df54-714a-4cce-970a-7069ffd1cb63","Type":"ContainerStarted","Data":"6716d080072bceadd2ed5e3ce9167274a019a1d2afa7a5fdd7e6168de9ae7eff"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.530101 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.530711 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.531022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h4rq9" event={"ID":"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5","Type":"ContainerStarted","Data":"c29190e9edb76271227a69183b60a6f82879d20379295d3858ab5b30abe344cd"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.531638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" event={"ID":"63a36577-fe9f-43d6-b1d9-5c918e1161b2","Type":"ContainerStarted","Data":"64a216fe23fc781d74ea6d3bd23e3cfb0e99f68dd6d5e2c0856dca5d45ea3497"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.533346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.550871 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.551012 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.050971257 +0000 UTC m=+209.920300204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.551929 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.552100 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.552430 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:41.05241736 +0000 UTC m=+209.921746297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.554684 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-45c2x" podStartSLOduration=6.554665462 podStartE2EDuration="6.554665462s" podCreationTimestamp="2026-03-18 15:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.550667986 +0000 UTC m=+209.419996933" watchObservedRunningTime="2026-03-18 15:37:40.554665462 +0000 UTC m=+209.423994399" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.558716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" event={"ID":"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073","Type":"ContainerStarted","Data":"a081cddc76226392f2f0bf7390018718acdea88add2769fee03d4a57ae80a172"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.558754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.577061 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-qw2mp"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.578910 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29564136-wt69c"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.584411 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.584477 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.595904 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" podStartSLOduration=161.595886974 podStartE2EDuration="2m41.595886974s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.594797074 +0000 UTC m=+209.464126021" watchObservedRunningTime="2026-03-18 15:37:40.595886974 +0000 UTC m=+209.465215911" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.612202 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0c8715a-4133-49bd-b48f-12377582b8ce" containerID="0b592d76ba3a5903e90b7d128911f844fec0fcda2ea6adea259b425f459af69a" exitCode=0 Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.612307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" 
event={"ID":"e0c8715a-4133-49bd-b48f-12377582b8ce","Type":"ContainerDied","Data":"0b592d76ba3a5903e90b7d128911f844fec0fcda2ea6adea259b425f459af69a"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.614018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrqhk"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.614813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ltvvk" event={"ID":"01af365e-5f9a-4030-b54e-ebee4cf39552","Type":"ContainerStarted","Data":"b1fa48d46dea1529f4e5927db1872c54f8294c8213c0f7d2600d834776b8dea4"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.626859 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b6vrp"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.630793 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podStartSLOduration=161.630778004 podStartE2EDuration="2m41.630778004s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.629362582 +0000 UTC m=+209.498691529" watchObservedRunningTime="2026-03-18 15:37:40.630778004 +0000 UTC m=+209.500106941" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.632687 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" event={"ID":"b9a5245d-b1d7-4826-8f4b-7dadfbae4263","Type":"ContainerStarted","Data":"ab1b6e1fe90c8f0afd1a6596bc146d84dfc62357d708e07609ae52a3c6b6c67e"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.643316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" 
event={"ID":"79d401c0-bb01-40eb-a81f-cf63a0762747","Type":"ContainerStarted","Data":"e0402c5211050a2dac3305759f1ca4dad1976efe7c998ce75c7eeb91b2d0f966"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.657899 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.659263 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.159242538 +0000 UTC m=+210.028571475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.676357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tfth7" event={"ID":"a0f79eb1-598c-4d72-af53-a928520fa0d9","Type":"ContainerStarted","Data":"a43dc5931abd19ba1c34b2d164b4b1ab3bfcaf343621632dc9419f8b6a728393"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.689411 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v"] Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.701092 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" event={"ID":"ac30b157-a927-4aa3-898d-917f8efd8338","Type":"ContainerStarted","Data":"7894cd5cd8ed62177dac3dd064318440fd9a1f9f630912f8aa7a0930a3b0918f"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.712074 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tfth7" podStartSLOduration=161.712059264 podStartE2EDuration="2m41.712059264s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.71139232 +0000 UTC m=+209.580721257" watchObservedRunningTime="2026-03-18 15:37:40.712059264 +0000 UTC m=+209.581388201" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.719359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" event={"ID":"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5","Type":"ContainerStarted","Data":"aab99b0833efc2a7d4f9975228d87d14fd0041d03f6dd9b636ad2a53bf61dd21"} Mar 18 15:37:40 crc kubenswrapper[4792]: W0318 15:37:40.724584 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d216655_c83a_4f17_9e9a_367579911a35.slice/crio-e5fa6dfe32838f38ef135977cd7d445ea6dcb7ddafaf8d5d7ccf32d28f6a19e0 WatchSource:0}: Error finding container e5fa6dfe32838f38ef135977cd7d445ea6dcb7ddafaf8d5d7ccf32d28f6a19e0: Status 404 returned error can't find the container with id e5fa6dfe32838f38ef135977cd7d445ea6dcb7ddafaf8d5d7ccf32d28f6a19e0 Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.733599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" 
event={"ID":"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0","Type":"ContainerStarted","Data":"6f4fe7e0e3653358800b16acd428ad2f3d94c72da7ab6b2174e5a220d6d12c18"} Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.760667 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.763113 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.263099146 +0000 UTC m=+210.132428083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.776603 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.784780 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" podStartSLOduration=161.78475931 podStartE2EDuration="2m41.78475931s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.747990122 +0000 UTC m=+209.617319079" watchObservedRunningTime="2026-03-18 15:37:40.78475931 +0000 UTC m=+209.654088247" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.785576 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" podStartSLOduration=161.78556972 podStartE2EDuration="2m41.78556972s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:40.783162922 +0000 UTC m=+209.652491879" watchObservedRunningTime="2026-03-18 15:37:40.78556972 +0000 UTC m=+209.654898657" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.862107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.864142 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.364121161 +0000 UTC m=+210.233450098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.889263 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.889935 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.890026 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 15:37:40 crc kubenswrapper[4792]: I0318 15:37:40.964739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:40 crc kubenswrapper[4792]: E0318 15:37:40.965153 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.465138836 +0000 UTC m=+210.334467773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.066424 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.066814 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.566798133 +0000 UTC m=+210.436127070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.168127 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.168550 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.668513944 +0000 UTC m=+210.537842871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.270805 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.271094 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.771080875 +0000 UTC m=+210.640409812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.375614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.376238 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.876225461 +0000 UTC m=+210.745554399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.476898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.477151 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.977133842 +0000 UTC m=+210.846462779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.477420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.477815 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:41.977794707 +0000 UTC m=+210.847123644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.524378 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48280: no serving certificate available for the kubelet" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.579494 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.580161 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.08014485 +0000 UTC m=+210.949473787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.681741 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.695372 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.182082708 +0000 UTC m=+211.051411645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.746152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qbhrj" event={"ID":"869a2d9d-cf2d-44bd-87a3-4536e5a6c2e0","Type":"ContainerStarted","Data":"3bd8e8acbd4656612139f238b360cf85c4770516b6c562fee859eea79fc0942c"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.759377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" event={"ID":"b9a5245d-b1d7-4826-8f4b-7dadfbae4263","Type":"ContainerStarted","Data":"c852c7618cf0e4b6c7a4233953acf9bf7a6a563b16b8917995e295eba6de6281"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.768381 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerStarted","Data":"dd31c588e6e21b3a6e8c6806ef68af5da9a7631eae231960a048f654e3580e1d"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.768434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerStarted","Data":"d805775f41c0fe6b44d1ee7cc24419f74d141de75cfc91b6e8c13d5ed7791393"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.769302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.776757 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" event={"ID":"0ca9e0b0-a082-4128-a24a-846d0fc5e7a5","Type":"ContainerStarted","Data":"25ebcbe0c85ed4bf781d546e89683eedb7ed817f8ea1527e348ced9d6927f7d4"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.783565 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.786846 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.286818389 +0000 UTC m=+211.156147336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.786952 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l8284 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.786995 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.787008 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.787417 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:42.287407031 +0000 UTC m=+211.156735968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.788037 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6rvj" podStartSLOduration=162.788020513 podStartE2EDuration="2m42.788020513s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:41.787736123 +0000 UTC m=+210.657065080" watchObservedRunningTime="2026-03-18 15:37:41.788020513 +0000 UTC m=+210.657349450" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.800638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" event={"ID":"106c799a-83d0-4815-ab5a-61c2b67b86f7","Type":"ContainerStarted","Data":"2f0cc99480209f97cadc4df5d5eb1a81d298e54cea1154fceb6c603cd4b73530"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.801074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.817783 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" 
event={"ID":"52f8e1dd-171e-474f-b424-e879a4d73a5e","Type":"ContainerStarted","Data":"d68975e41e3679dfe5d1215315e7e1968b61bcec84ff5abdddf4e31c909dfa47"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.830427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" event={"ID":"3175cda1-84cb-469d-9f49-028132d324ea","Type":"ContainerStarted","Data":"034e51d24cebf7661c1ff34ae78f7b1ef275a78ff8dfb4cbc7a4c715c541ecd7"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.843174 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.843235 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.863752 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ql7f8" podStartSLOduration=162.86373724 podStartE2EDuration="2m42.86373724s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:41.862026807 +0000 UTC m=+210.731355754" watchObservedRunningTime="2026-03-18 15:37:41.86373724 +0000 UTC m=+210.733066177" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.870143 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-79b997595-l8284" podStartSLOduration=162.870125734 podStartE2EDuration="2m42.870125734s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:41.826368169 +0000 UTC m=+210.695697136" watchObservedRunningTime="2026-03-18 15:37:41.870125734 +0000 UTC m=+210.739454671" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.888663 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.889001 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.388944995 +0000 UTC m=+211.258273932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.892450 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"483bd93698b42491e497017b789b96cff076b0a93904e55640de0b80d6a93f83"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.892491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b25c030f6f7cb2853135b04ae2a636221afa0ff8dda4fdf6739eee06339b9db3"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.892503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" event={"ID":"2c77956c-88bf-4e94-a8de-a41728753ccd","Type":"ContainerStarted","Data":"1bca90bfe8f505b35b43705bca61006e30650fd20a2a8fc4daa0d3a4946760d4"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.892516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" event={"ID":"2c77956c-88bf-4e94-a8de-a41728753ccd","Type":"ContainerStarted","Data":"e03f96735acf3f2c8acdacc34f5cb2057cb2de19b5f125bea3dd53b1e6feaa39"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.901372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564136-wt69c" 
event={"ID":"e094f66b-fe57-429a-b5cd-de6084a8aacb","Type":"ContainerStarted","Data":"3801e39f6838dd3a2128ee9d248b3994b469121df89f4c0b44757184d8fd1dd8"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.904281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:41 crc kubenswrapper[4792]: E0318 15:37:41.907199 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.407183893 +0000 UTC m=+211.276512830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.907837 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:41 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:41 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:41 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 
15:37:41.907871 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.945807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ltvvk" event={"ID":"01af365e-5f9a-4030-b54e-ebee4cf39552","Type":"ContainerStarted","Data":"0521ec6c0bd6bd343b396cc47be3cb0675619a64e7ff5f12267035765bed9ede"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.946108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.948068 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.948110 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.948739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" event={"ID":"7d216655-c83a-4f17-9e9a-367579911a35","Type":"ContainerStarted","Data":"46795b6954ac735a1fadd498acb7007c49a9c582d7208192845fb7c1fd27d5de"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.948788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" event={"ID":"7d216655-c83a-4f17-9e9a-367579911a35","Type":"ContainerStarted","Data":"e5fa6dfe32838f38ef135977cd7d445ea6dcb7ddafaf8d5d7ccf32d28f6a19e0"} Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.949379 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:37:41 crc kubenswrapper[4792]: I0318 15:37:41.967308 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" event={"ID":"be2f97af-46a9-455f-bc78-9b25a736d1f5","Type":"ContainerStarted","Data":"53d219af0dfcf3ef878f41aa40a8ed82b808541cc642e609a610b01ac2778a53"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.012064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.012221 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.512190625 +0000 UTC m=+211.381519572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.013129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.013430 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.513418709 +0000 UTC m=+211.382747646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.015506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h4rq9" event={"ID":"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5","Type":"ContainerStarted","Data":"213f72edd5f352964022bc40f7678eb61be21d02f535c7a43ea337229c7359aa"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.035392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" event={"ID":"6a2d17c6-a211-4f41-89a0-a80ece425ed7","Type":"ContainerStarted","Data":"844f8cdb9ef046ef6b6a436ec522f57819f21beefa5ec69d5d4e8ef42e5d655e"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.046216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" event={"ID":"6a509c1f-d106-4f35-9226-58a6779b738b","Type":"ContainerStarted","Data":"73d340c04c91fbac902ca6fb2db6b7f83e5abdb6d751a7c1b50dfe85d4d72c50"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.046261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" event={"ID":"6a509c1f-d106-4f35-9226-58a6779b738b","Type":"ContainerStarted","Data":"7e1a99a416c10e048dbffee037fb53d6bbc15d5032974ef5b6ee80ecfcc1d99f"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.047085 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:42 
crc kubenswrapper[4792]: I0318 15:37:42.073796 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mm4h4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.073842 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" podUID="6a509c1f-d106-4f35-9226-58a6779b738b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.079755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" event={"ID":"63a36577-fe9f-43d6-b1d9-5c918e1161b2","Type":"ContainerStarted","Data":"6bd9e5eb4900aca8dcf9d9dfdb5281532668a2a04ea90185c8937c76b2e5c108"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.079793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" event={"ID":"63a36577-fe9f-43d6-b1d9-5c918e1161b2","Type":"ContainerStarted","Data":"c2a15505c67bc0356fbfaa0c6c0b2619f978ced102da032406aeb85404ce0e1c"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.085241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b6vrp" event={"ID":"57f057e4-26f7-4d1f-92bb-18886619837c","Type":"ContainerStarted","Data":"77486a0d1e5d7c7df94e00d21ff43c71370abfbd1cf6a7dff04e9da43cccad28"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.085274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b6vrp" 
event={"ID":"57f057e4-26f7-4d1f-92bb-18886619837c","Type":"ContainerStarted","Data":"6299473d224ff42da461189cbb2a44fb8d812da396f17419e816f3c077d6ea21"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.089325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9be16b18e1ff128b4fa894fa31732e55ec6b79ae074fa9789cd39a690a089cdd"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.112888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" event={"ID":"f43e9d58-ee06-48d9-bed5-6b2e039d0a3a","Type":"ContainerStarted","Data":"57dafa86a3a533689081231d9b8189e7356dd1c5f333ccf31009f84d2fee6900"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.113960 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.114940 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.614915291 +0000 UTC m=+211.484244228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.150568 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" event={"ID":"bfb00e07-7b51-4687-8cf9-6e285f133f5b","Type":"ContainerStarted","Data":"0d9bd1216121be17b9a79927af7d375cde75a5d691ad1678d874733932f60a4a"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.157478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" event={"ID":"f75c6f2b-0965-4e4d-9445-dc65d69c970b","Type":"ContainerStarted","Data":"0c824d7f150e3975b6d6430685580dea4d25e8bb1f7f88b7bf95a6a4e35e9fe7"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.158209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.160008 4792 generic.go:334] "Generic (PLEG): container finished" podID="39d1af38-2c59-403e-a97e-2e22ac2737b3" containerID="84002a2fcae0f70e8327dfced3ba3f0d184ab0e92bf4a5acc0287004cd5d6d8a" exitCode=0 Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.160135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" event={"ID":"39d1af38-2c59-403e-a97e-2e22ac2737b3","Type":"ContainerDied","Data":"84002a2fcae0f70e8327dfced3ba3f0d184ab0e92bf4a5acc0287004cd5d6d8a"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.164106 4792 
patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmfwt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.164160 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podUID="f75c6f2b-0965-4e4d-9445-dc65d69c970b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.178754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" event={"ID":"c2ff6f1e-85ce-4aa9-b3ac-73b0d78abcbe","Type":"ContainerStarted","Data":"dfd979f6bb43e0e3357f9690c2fa23dbecb402327ec82f7b84e4ac54e1edb62b"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.213007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-85pml" event={"ID":"ac30b157-a927-4aa3-898d-917f8efd8338","Type":"ContainerStarted","Data":"512bfad1737aa7114ae6b3d564740646c3b99c178bef6082e4949264294edd04"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.215008 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.217530 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.717513634 +0000 UTC m=+211.586842571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.251184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" event={"ID":"e0c8715a-4133-49bd-b48f-12377582b8ce","Type":"ContainerStarted","Data":"aacd91601c80ba40ebad0e42ffd7fe7ce468b8ac1c3aabaaf6df6388d8aba4dd"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.251920 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.255538 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.255920 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.276083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" event={"ID":"b0f66845-303e-42cf-b091-be0ac57cba20","Type":"ContainerStarted","Data":"ce4e6aa1a94bc74626e7191d0dd1704e76511acbe2b60678a73bb3f5a531d3d0"} Mar 
18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.276142 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" event={"ID":"b0f66845-303e-42cf-b091-be0ac57cba20","Type":"ContainerStarted","Data":"2cee006ee668b111144efe8aa521c9afd261d54b136296578514293362de80ed"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.293000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" event={"ID":"79d401c0-bb01-40eb-a81f-cf63a0762747","Type":"ContainerStarted","Data":"0ca16d5e8b700f5f0ae97d71331115f61be58bc7b71e1039640b15d30d14e766"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.293221 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerName="route-controller-manager" containerID="cri-o://0ca16d5e8b700f5f0ae97d71331115f61be58bc7b71e1039640b15d30d14e766" gracePeriod=30 Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.293698 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.315596 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.326964 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" 
event={"ID":"6813d755-7c94-40a7-a07c-95a2073a47fb","Type":"ContainerStarted","Data":"c7f7e761eb695f251afd73ca9d3e6f524b4e84d0243c609e1ef75c198e0a37ec"} Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.330711 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.830693535 +0000 UTC m=+211.700022472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.345395 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6f526f76a0eed45616ef93b587ed6bf61c108d3f7e67b9b880c938520eb9073b"} Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.345777 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerName="controller-manager" containerID="cri-o://78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb" gracePeriod=30 Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.350615 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.418940 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.420862 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:42.920848051 +0000 UTC m=+211.790176988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.438605 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podStartSLOduration=163.438586791 podStartE2EDuration="2m43.438586791s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.39872025 +0000 UTC m=+211.268049187" watchObservedRunningTime="2026-03-18 15:37:42.438586791 +0000 UTC m=+211.307915728" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.479096 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lt4p5" 
podStartSLOduration=163.479080847 podStartE2EDuration="2m43.479080847s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.43909226 +0000 UTC m=+211.308421207" watchObservedRunningTime="2026-03-18 15:37:42.479080847 +0000 UTC m=+211.348409784" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.522680 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.523217 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.023202014 +0000 UTC m=+211.892530951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.609199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ltvvk" podStartSLOduration=163.609183567 podStartE2EDuration="2m43.609183567s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.60787906 +0000 UTC m=+211.477207997" watchObservedRunningTime="2026-03-18 15:37:42.609183567 +0000 UTC m=+211.478512494" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.621569 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.624745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.625071 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 15:37:43.12505791 +0000 UTC m=+211.994386847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.630181 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mrbj7" podStartSLOduration=163.630164567 podStartE2EDuration="2m43.630164567s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.630075644 +0000 UTC m=+211.499404571" watchObservedRunningTime="2026-03-18 15:37:42.630164567 +0000 UTC m=+211.499493504" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.655367 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zcnn4" podStartSLOduration=163.655353481 podStartE2EDuration="2m43.655353481s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.654330143 +0000 UTC m=+211.523659080" watchObservedRunningTime="2026-03-18 15:37:42.655353481 +0000 UTC m=+211.524682418" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.726603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.726732 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.226709717 +0000 UTC m=+212.096038654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.727060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.727371 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.227354581 +0000 UTC m=+212.096683518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.834232 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b6vrp" podStartSLOduration=8.83421418 podStartE2EDuration="8.83421418s" podCreationTimestamp="2026-03-18 15:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.767358919 +0000 UTC m=+211.636687866" watchObservedRunningTime="2026-03-18 15:37:42.83421418 +0000 UTC m=+211.703543127" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.837860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.838451 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.338434426 +0000 UTC m=+212.207763363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.868930 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" podStartSLOduration=163.868906873 podStartE2EDuration="2m43.868906873s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.837416628 +0000 UTC m=+211.706745575" watchObservedRunningTime="2026-03-18 15:37:42.868906873 +0000 UTC m=+211.738235820" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.924344 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:42 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:42 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:42 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.924448 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.946533 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:42 crc kubenswrapper[4792]: E0318 15:37:42.946996 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.446970206 +0000 UTC m=+212.316299143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:42 crc kubenswrapper[4792]: I0318 15:37:42.965737 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podStartSLOduration=163.965719413 podStartE2EDuration="2m43.965719413s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.873083566 +0000 UTC m=+211.742412503" watchObservedRunningTime="2026-03-18 15:37:42.965719413 +0000 UTC m=+211.835048360" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.001340 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" 
podStartSLOduration=164.001325008 podStartE2EDuration="2m44.001325008s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.999031775 +0000 UTC m=+211.868360712" watchObservedRunningTime="2026-03-18 15:37:43.001325008 +0000 UTC m=+211.870653945" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.009116 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" podStartSLOduration=164.009097464 podStartE2EDuration="2m44.009097464s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:42.972205821 +0000 UTC m=+211.841534778" watchObservedRunningTime="2026-03-18 15:37:43.009097464 +0000 UTC m=+211.878426401" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.025393 4792 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9j26r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": read tcp 10.217.0.2:42670->10.217.0.27:8443: read: connection reset by peer" start-of-body= Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.025561 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": read tcp 10.217.0.2:42670->10.217.0.27:8443: read: connection reset by peer" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.050268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.050563 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.550548264 +0000 UTC m=+212.419877201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.075567 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ckcd4" podStartSLOduration=164.075547731 podStartE2EDuration="2m44.075547731s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.049747165 +0000 UTC m=+211.919076112" watchObservedRunningTime="2026-03-18 15:37:43.075547731 +0000 UTC m=+211.944876668" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.152199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.152613 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.652598806 +0000 UTC m=+212.521927743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.180038 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xknnv" podStartSLOduration=164.180024243 podStartE2EDuration="2m44.180024243s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.177298772 +0000 UTC m=+212.046627709" watchObservedRunningTime="2026-03-18 15:37:43.180024243 +0000 UTC m=+212.049353180" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.199239 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.217719 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.220041 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vtf8t" podStartSLOduration=164.220023359 podStartE2EDuration="2m44.220023359s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.217462255 +0000 UTC m=+212.086791192" watchObservedRunningTime="2026-03-18 15:37:43.220023359 +0000 UTC m=+212.089352296" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.244221 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cjxlb" podStartSLOduration=164.244206316 podStartE2EDuration="2m44.244206316s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.242459492 +0000 UTC m=+212.111788419" watchObservedRunningTime="2026-03-18 15:37:43.244206316 +0000 UTC m=+212.113535253" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.253269 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.253587 4792 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.753571869 +0000 UTC m=+212.622900806 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.278993 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" podStartSLOduration=164.27895344 podStartE2EDuration="2m44.27895344s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.278557085 +0000 UTC m=+212.147886012" watchObservedRunningTime="2026-03-18 15:37:43.27895344 +0000 UTC m=+212.148282377" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.319831 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" podStartSLOduration=164.319813269 podStartE2EDuration="2m44.319813269s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.319360182 +0000 UTC m=+212.188689129" watchObservedRunningTime="2026-03-18 15:37:43.319813269 +0000 UTC m=+212.189142196" Mar 18 15:37:43 crc kubenswrapper[4792]: 
I0318 15:37:43.353814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn67g\" (UniqueName: \"kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g\") pod \"c323f5d1-220a-41eb-a3db-e56dedfafc29\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.353945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert\") pod \"c323f5d1-220a-41eb-a3db-e56dedfafc29\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.354252 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles\") pod \"c323f5d1-220a-41eb-a3db-e56dedfafc29\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.354352 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config\") pod \"c323f5d1-220a-41eb-a3db-e56dedfafc29\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.354464 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca\") pod \"c323f5d1-220a-41eb-a3db-e56dedfafc29\" (UID: \"c323f5d1-220a-41eb-a3db-e56dedfafc29\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.354660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.355468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca" (OuterVolumeSpecName: "client-ca") pod "c323f5d1-220a-41eb-a3db-e56dedfafc29" (UID: "c323f5d1-220a-41eb-a3db-e56dedfafc29"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.355484 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.855470447 +0000 UTC m=+212.724799384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.355655 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c323f5d1-220a-41eb-a3db-e56dedfafc29" (UID: "c323f5d1-220a-41eb-a3db-e56dedfafc29"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.355672 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config" (OuterVolumeSpecName: "config") pod "c323f5d1-220a-41eb-a3db-e56dedfafc29" (UID: "c323f5d1-220a-41eb-a3db-e56dedfafc29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.367046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c323f5d1-220a-41eb-a3db-e56dedfafc29" (UID: "c323f5d1-220a-41eb-a3db-e56dedfafc29"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.367637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" event={"ID":"7d216655-c83a-4f17-9e9a-367579911a35","Type":"ContainerStarted","Data":"a183b5757632fc1e374a7fbe1e1094bd42ae44fc4095ed761d395374aea104b1"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.373095 4792 generic.go:334] "Generic (PLEG): container finished" podID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerID="78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb" exitCode=0 Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.373146 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" event={"ID":"c323f5d1-220a-41eb-a3db-e56dedfafc29","Type":"ContainerDied","Data":"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.373166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" event={"ID":"c323f5d1-220a-41eb-a3db-e56dedfafc29","Type":"ContainerDied","Data":"cbfde79dff05edb763a16a521be5bbbb875f0052c6608fc04d54fe4cd374fbfe"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.373181 4792 scope.go:117] "RemoveContainer" containerID="78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.373273 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x2m75" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.376411 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g" (OuterVolumeSpecName: "kube-api-access-dn67g") pod "c323f5d1-220a-41eb-a3db-e56dedfafc29" (UID: "c323f5d1-220a-41eb-a3db-e56dedfafc29"). InnerVolumeSpecName "kube-api-access-dn67g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.398615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" event={"ID":"39d1af38-2c59-403e-a97e-2e22ac2737b3","Type":"ContainerStarted","Data":"7f826d794d6886062b5e6cff2f72b913f26f52fd3fb2272681e70821df27cd67"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.404627 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" podStartSLOduration=164.404608008 podStartE2EDuration="2m44.404608008s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.365691561 +0000 UTC m=+212.235020508" watchObservedRunningTime="2026-03-18 15:37:43.404608008 +0000 UTC m=+212.273936945" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.428114 4792 scope.go:117] "RemoveContainer" containerID="78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.431819 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb\": container with ID starting with 78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb not found: ID does not exist" containerID="78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.431853 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb"} err="failed to get container status \"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb\": rpc error: code = NotFound desc = could not find 
container \"78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb\": container with ID starting with 78103d55acb28c9995267c75073be42908ac6eb41c8578b41c907d9afd261aeb not found: ID does not exist" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.434418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" event={"ID":"3175cda1-84cb-469d-9f49-028132d324ea","Type":"ContainerStarted","Data":"e7f05b37ad6ca6b878fab084ad31ab16181f1bec88838c8a272e1e309bd795f6"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.453588 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.453792 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerName="controller-manager" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.453802 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerName="controller-manager" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.453897 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" containerName="controller-manager" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.458593 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.458958 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 
15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.458973 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.458997 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c323f5d1-220a-41eb-a3db-e56dedfafc29-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.459008 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn67g\" (UniqueName: \"kubernetes.io/projected/c323f5d1-220a-41eb-a3db-e56dedfafc29-kube-api-access-dn67g\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.459018 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c323f5d1-220a-41eb-a3db-e56dedfafc29-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.459668 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:43.959652707 +0000 UTC m=+212.828981644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.467161 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.468255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h4rq9" event={"ID":"51bc1a65-c9ad-4c6c-9f53-fa5a1388a1a5","Type":"ContainerStarted","Data":"5e680ffd367a954a289236e048d4fe66d52902f541c4e61fd07ad25385f72b90"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.469032 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.481059 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.504606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" event={"ID":"bfb00e07-7b51-4687-8cf9-6e285f133f5b","Type":"ContainerStarted","Data":"6b85dc6e5603a0816a74a73c88eabae98ae76657ddcbcd050a716db3d6646b18"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.504657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" 
event={"ID":"bfb00e07-7b51-4687-8cf9-6e285f133f5b","Type":"ContainerStarted","Data":"a6e730954c68fd8071d33fe80394ad5ea25da5543a0d720e9efde1b70a2412ef"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.521447 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-9j26r_79d401c0-bb01-40eb-a81f-cf63a0762747/route-controller-manager/0.log" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.521490 4792 generic.go:334] "Generic (PLEG): container finished" podID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerID="0ca16d5e8b700f5f0ae97d71331115f61be58bc7b71e1039640b15d30d14e766" exitCode=255 Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.522672 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" event={"ID":"79d401c0-bb01-40eb-a81f-cf63a0762747","Type":"ContainerDied","Data":"0ca16d5e8b700f5f0ae97d71331115f61be58bc7b71e1039640b15d30d14e766"} Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.524846 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l8284 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.524887 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.543328 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.543373 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.547340 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.555379 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wjwpn" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.561303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4n4g\" (UniqueName: \"kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.563625 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.563922 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.06390917 +0000 UTC m=+212.933238097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.567172 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h4rq9" podStartSLOduration=8.567153519 podStartE2EDuration="8.567153519s" podCreationTimestamp="2026-03-18 15:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.548191364 +0000 UTC m=+212.417520301" watchObservedRunningTime="2026-03-18 15:37:43.567153519 +0000 UTC m=+212.436482456" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.607062 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrqhk" podStartSLOduration=164.607039182 podStartE2EDuration="2m44.607039182s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:43.602403413 +0000 UTC m=+212.471732350" watchObservedRunningTime="2026-03-18 15:37:43.607039182 +0000 UTC m=+212.476368109" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.653849 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-9j26r_79d401c0-bb01-40eb-a81f-cf63a0762747/route-controller-manager/0.log" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.653920 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.667556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.684684 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4n4g\" (UniqueName: \"kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.684767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.685056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " 
pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.685259 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.685376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.686912 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.1868929 +0000 UTC m=+213.056221847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.703802 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.719334 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.775262 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.776895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: 
\"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.778331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4n4g\" (UniqueName: \"kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g\") pod \"controller-manager-7d68fbb675-wx2nv\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.787637 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmdz\" (UniqueName: \"kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz\") pod \"79d401c0-bb01-40eb-a81f-cf63a0762747\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.788475 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert\") pod \"79d401c0-bb01-40eb-a81f-cf63a0762747\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.788584 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config\") pod \"79d401c0-bb01-40eb-a81f-cf63a0762747\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.788690 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca\") pod \"79d401c0-bb01-40eb-a81f-cf63a0762747\" (UID: \"79d401c0-bb01-40eb-a81f-cf63a0762747\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 
15:37:43.789055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.789695 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.289684401 +0000 UTC m=+213.159013338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.791005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config" (OuterVolumeSpecName: "config") pod "79d401c0-bb01-40eb-a81f-cf63a0762747" (UID: "79d401c0-bb01-40eb-a81f-cf63a0762747"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.791537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca" (OuterVolumeSpecName: "client-ca") pod "79d401c0-bb01-40eb-a81f-cf63a0762747" (UID: "79d401c0-bb01-40eb-a81f-cf63a0762747"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.793282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz" (OuterVolumeSpecName: "kube-api-access-wgmdz") pod "79d401c0-bb01-40eb-a81f-cf63a0762747" (UID: "79d401c0-bb01-40eb-a81f-cf63a0762747"). InnerVolumeSpecName "kube-api-access-wgmdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.805574 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.813508 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "79d401c0-bb01-40eb-a81f-cf63a0762747" (UID: "79d401c0-bb01-40eb-a81f-cf63a0762747"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.823109 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.830469 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x2m75"] Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.892573 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:43 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:43 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:43 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.892621 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.893286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.893531 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.893540 4792 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79d401c0-bb01-40eb-a81f-cf63a0762747-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.893588 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmdz\" (UniqueName: \"kubernetes.io/projected/79d401c0-bb01-40eb-a81f-cf63a0762747-kube-api-access-wgmdz\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.893597 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d401c0-bb01-40eb-a81f-cf63a0762747-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:43 crc kubenswrapper[4792]: E0318 15:37:43.893646 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.393633483 +0000 UTC m=+213.262962420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.899280 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c323f5d1-220a-41eb-a3db-e56dedfafc29" path="/var/lib/kubelet/pods/c323f5d1-220a-41eb-a3db-e56dedfafc29/volumes" Mar 18 15:37:43 crc kubenswrapper[4792]: I0318 15:37:43.906378 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.000568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.000960 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.500947148 +0000 UTC m=+213.370276105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.101412 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.101731 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.601705753 +0000 UTC m=+213.471034690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.101928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.102379 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.602364147 +0000 UTC m=+213.471693084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.153906 4792 ???:1] "http: TLS handshake error from 192.168.126.11:48282: no serving certificate available for the kubelet" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.205804 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.205994 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.705943326 +0000 UTC m=+213.575272263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.206050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.206628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.706615861 +0000 UTC m=+213.575944798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.226585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.226811 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerName="route-controller-manager" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.226827 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerName="route-controller-manager" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.226970 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" containerName="route-controller-manager" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.228791 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.231468 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.252858 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.270532 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.307870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.308071 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpmfz\" (UniqueName: \"kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.308135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.308160 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.308290 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:44.808274039 +0000 UTC m=+213.677602976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: W0318 15:37:44.324251 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c12f09_78e4_4760_a527_2b90d7a10e99.slice/crio-b409e00f06ea2019c33dfccebbacd46a405968788afba5ab1795a81243161af3 WatchSource:0}: Error finding container b409e00f06ea2019c33dfccebbacd46a405968788afba5ab1795a81243161af3: Status 404 returned error can't find the container with id b409e00f06ea2019c33dfccebbacd46a405968788afba5ab1795a81243161af3 Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.412896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpmfz\" (UniqueName: \"kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz\") pod 
\"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.412956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.413035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.413058 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.413507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.413687 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-18 15:37:44.913670204 +0000 UTC m=+213.782999211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.413724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.458029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpmfz\" (UniqueName: \"kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz\") pod \"community-operators-jqrw8\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.514179 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.514603 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.014581685 +0000 UTC m=+213.883910622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.514940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.515370 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.015347053 +0000 UTC m=+213.884676000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.528814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" event={"ID":"a2c12f09-78e4-4760-a527-2b90d7a10e99","Type":"ContainerStarted","Data":"b409e00f06ea2019c33dfccebbacd46a405968788afba5ab1795a81243161af3"} Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.531563 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-9j26r_79d401c0-bb01-40eb-a81f-cf63a0762747/route-controller-manager/0.log" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.531619 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" event={"ID":"79d401c0-bb01-40eb-a81f-cf63a0762747","Type":"ContainerDied","Data":"e0402c5211050a2dac3305759f1ca4dad1976efe7c998ce75c7eeb91b2d0f966"} Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.531645 4792 scope.go:117] "RemoveContainer" containerID="0ca16d5e8b700f5f0ae97d71331115f61be58bc7b71e1039640b15d30d14e766" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.531767 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.539286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" event={"ID":"39d1af38-2c59-403e-a97e-2e22ac2737b3","Type":"ContainerStarted","Data":"a307e23bdcce6bcfdb8e8e395864975ef0c2e15d4a4031c50e281a07204b372b"} Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.551049 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.551290 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.555870 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.556382 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.562592 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9j26r"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.611493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" podStartSLOduration=165.611478778 podStartE2EDuration="2m45.611478778s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:44.609463824 +0000 UTC m=+213.478792751" watchObservedRunningTime="2026-03-18 15:37:44.611478778 
+0000 UTC m=+213.480807715" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.616284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.616493 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.11645077 +0000 UTC m=+213.985779707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.626136 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.627040 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.658596 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.720564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.720608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.720660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.720706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 
15:37:44.721025 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.221009315 +0000 UTC m=+214.090338252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.821743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.821910 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.821946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc 
kubenswrapper[4792]: I0318 15:37:44.822072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.822124 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.822525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.824919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.834703 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.835587 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.335568896 +0000 UTC m=+214.204897833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.837906 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.840159 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.853626 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.857272 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j66mt"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.868768 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.869328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw\") pod \"community-operators-ps7zj\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.882943 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.888685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j66mt"] Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.889477 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:44 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:44 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:44 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.889523 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.924687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:44 crc kubenswrapper[4792]: E0318 15:37:44.925111 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.425098289 +0000 UTC m=+214.294427226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:44 crc kubenswrapper[4792]: I0318 15:37:44.974614 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.028664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.029141 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.529121054 +0000 UTC m=+214.398449991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7c8z\" (UniqueName: \"kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029221 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029249 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029295 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.029424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.029840 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.529830951 +0000 UTC m=+214.399159888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.036365 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"] Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.037287 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.056771 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"] Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.089492 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:37:45 crc kubenswrapper[4792]: W0318 15:37:45.107100 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba56c2f_0c1b_4201_9960_590b2cb73fb6.slice/crio-13fe3eb7d5ad330da2daafe7dd79bdd05f78c0287e42f3ac9315d0ea0430b265 WatchSource:0}: Error finding container 13fe3eb7d5ad330da2daafe7dd79bdd05f78c0287e42f3ac9315d0ea0430b265: Status 404 returned error can't find the container with id 13fe3eb7d5ad330da2daafe7dd79bdd05f78c0287e42f3ac9315d0ea0430b265 Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.130715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.130816 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.630795263 +0000 UTC m=+214.500124200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131141 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7c8z\" (UniqueName: \"kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131275 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131307 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76n4n\" (UniqueName: \"kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131505 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.131532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.132008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.132050 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.132315 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.632301008 +0000 UTC m=+214.501630025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.135371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.178035 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.187486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7c8z\" (UniqueName: \"kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z\") pod \"certified-operators-j66mt\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.236514 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.236909 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.236931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76n4n\" (UniqueName: \"kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.237004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.237201 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.737174065 +0000 UTC m=+214.606503002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.237587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.237630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.261076 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.288866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76n4n\" (UniqueName: \"kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n\") pod \"certified-operators-b4ws8\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.343168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.343673 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.84365829 +0000 UTC m=+214.712987227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.378337 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.384048 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.447276 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.447496 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.947467347 +0000 UTC m=+214.816796284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.447555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:45 crc kubenswrapper[4792]: W0318 15:37:45.447704 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab5703e_d7a2_4839_b0d5_2851a53b5f6f.slice/crio-12e26abc1610172dbf0c43f6102943563ec97301f573e9f63c467d71050823bb WatchSource:0}: Error finding container 12e26abc1610172dbf0c43f6102943563ec97301f573e9f63c467d71050823bb: Status 404 returned error can't find the container with id 12e26abc1610172dbf0c43f6102943563ec97301f573e9f63c467d71050823bb
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.447994 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:45.947981575 +0000 UTC m=+214.817310512 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.464865 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.549635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.550089 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.050074069 +0000 UTC m=+214.919403006 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.574836 4792 generic.go:334] "Generic (PLEG): container finished" podID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerID="e7b351bf7f016c6783724fa4a7474b7c93820944ed6968a08feff2c498415b17" exitCode=0
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.574919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerDied","Data":"e7b351bf7f016c6783724fa4a7474b7c93820944ed6968a08feff2c498415b17"}
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.574945 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerStarted","Data":"13fe3eb7d5ad330da2daafe7dd79bdd05f78c0287e42f3ac9315d0ea0430b265"}
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.579041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerStarted","Data":"12e26abc1610172dbf0c43f6102943563ec97301f573e9f63c467d71050823bb"}
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.608734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" event={"ID":"3175cda1-84cb-469d-9f49-028132d324ea","Type":"ContainerStarted","Data":"5de151d15166f70672ed3b8e3057dab68d035242f937804123b9e63bd0957577"}
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.631935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" event={"ID":"a2c12f09-78e4-4760-a527-2b90d7a10e99","Type":"ContainerStarted","Data":"138abdbf150a14e65c626425f52c8c8106b4c24e242ac5b6bd08f92a0d0f24af"}
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.636736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.651366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.651935 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" podStartSLOduration=5.651920005 podStartE2EDuration="5.651920005s" podCreationTimestamp="2026-03-18 15:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:45.649955162 +0000 UTC m=+214.519284099" watchObservedRunningTime="2026-03-18 15:37:45.651920005 +0000 UTC m=+214.521248942"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.655706 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.656407 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.156393368 +0000 UTC m=+215.025722305 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.741102 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j66mt"]
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.752462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.752601 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.252576536 +0000 UTC m=+215.121905473 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.752761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.753687 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.253669026 +0000 UTC m=+215.122997963 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.812433 4792 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.853482 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.853625 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.353598831 +0000 UTC m=+215.222927758 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.854075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.854451 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.354436441 +0000 UTC m=+215.223765378 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.873379 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d401c0-bb01-40eb-a81f-cf63a0762747" path="/var/lib/kubelet/pods/79d401c0-bb01-40eb-a81f-cf63a0762747/volumes"
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.879068 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"]
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.890011 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 15:37:45 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 18 15:37:45 crc kubenswrapper[4792]: [+]process-running ok
Mar 18 15:37:45 crc kubenswrapper[4792]: healthz check failed
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.890064 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 15:37:45 crc kubenswrapper[4792]: W0318 15:37:45.906049 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd261d3_47ac_4082_bc3b_128b9b72df06.slice/crio-98488a92705e850367393dd46943e8b59a23936a5d80a71079a12d93f45b8054 WatchSource:0}: Error finding container 98488a92705e850367393dd46943e8b59a23936a5d80a71079a12d93f45b8054: Status 404 returned error can't find the container with id 98488a92705e850367393dd46943e8b59a23936a5d80a71079a12d93f45b8054
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.955688 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:45 crc kubenswrapper[4792]: E0318 15:37:45.956121 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.45610386 +0000 UTC m=+215.325432797 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:45 crc kubenswrapper[4792]: I0318 15:37:45.957650 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 18 15:37:45 crc kubenswrapper[4792]: W0318 15:37:45.972744 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8f4ace7c_3804_417a_bbdb_787805432273.slice/crio-996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588 WatchSource:0}: Error finding container 996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588: Status 404 returned error can't find the container with id 996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.057192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:46 crc kubenswrapper[4792]: E0318 15:37:46.057606 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 15:37:46.557582622 +0000 UTC m=+215.426911559 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlsn2" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.078535 4792 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T15:37:45.812701011Z","Handler":null,"Name":""}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.106567 4792 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.106610 4792 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.159082 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.163922 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.260814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.293831 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.293916 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.392908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlsn2\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.425457 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"]
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.427031 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.427839 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"]
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.428464 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.431903 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.432063 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.432684 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.432987 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.433004 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.433240 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.439717 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.442537 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"]
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.473138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"]
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6kzk\" (UniqueName: \"kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565786 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565922 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vck\" (UniqueName: \"kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.565957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.580559 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.652566 4792 generic.go:334] "Generic (PLEG): container finished" podID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerID="bdd8003c10399363c6bb26405d3eb81cb892bfedffe91f49486fc11f36500bf6" exitCode=0
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.652739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerDied","Data":"bdd8003c10399363c6bb26405d3eb81cb892bfedffe91f49486fc11f36500bf6"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.652774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerStarted","Data":"866e30f0caeabd6de1216e508da8edb23711cc06e1d2d0d13429bf59a37ddd14"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.664955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" event={"ID":"3175cda1-84cb-469d-9f49-028132d324ea","Type":"ContainerStarted","Data":"828c3818264b25f200b9c40d6d4de6e5a8fc3af3a98383a4ecef49f635cc4313"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.665068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" event={"ID":"3175cda1-84cb-469d-9f49-028132d324ea","Type":"ContainerStarted","Data":"e3faae1224d95fd02633a40a83d2ec2b72c78e27656f100ab27b5295585538cf"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.667448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vck\" (UniqueName: \"kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.667602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.667652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.669517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.669683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.670137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6kzk\" (UniqueName: \"kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.670167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.671029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.672891 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.674793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f4ace7c-3804-417a-bbdb-787805432273","Type":"ContainerStarted","Data":"30d77ff3640e82e375827a585ff255aafb76dee73db80923fff92e146f26da27"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.674913 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f4ace7c-3804-417a-bbdb-787805432273","Type":"ContainerStarted","Data":"996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.677442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.679744 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.679835 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerID="543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899" exitCode=0
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.679858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerDied","Data":"543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.686274 4792 generic.go:334] "Generic (PLEG): container finished" podID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerID="c4515a464551556860e70804dbbaf8091ad56d94a555c8bfadaaf4d7dbea10d2" exitCode=0
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.686474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerDied","Data":"c4515a464551556860e70804dbbaf8091ad56d94a555c8bfadaaf4d7dbea10d2"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.686515 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerStarted","Data":"98488a92705e850367393dd46943e8b59a23936a5d80a71079a12d93f45b8054"}
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.702250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6kzk\" (UniqueName: \"kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.702525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vck\" (UniqueName: \"kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck\") pod \"redhat-marketplace-nztp5\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.714740 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert\") pod \"route-controller-manager-6f4fcb4f7-wbpd4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.756303 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nztp5"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.773571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.806425 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-qw2mp" podStartSLOduration=12.806409473 podStartE2EDuration="12.806409473s" podCreationTimestamp="2026-03-18 15:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:46.76134834 +0000 UTC m=+215.630677277" watchObservedRunningTime="2026-03-18 15:37:46.806409473 +0000 UTC m=+215.675738410"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.828592 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.828576226 podStartE2EDuration="2.828576226s" podCreationTimestamp="2026-03-18 15:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:46.828021585 +0000 UTC m=+215.697350522" watchObservedRunningTime="2026-03-18 15:37:46.828576226 +0000 UTC m=+215.697905163"
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.834699 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v4lw5"]
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.836243 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.844259 4792 patch_prober.go:28] interesting pod/console-f9d7485db-gs8sw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.844384 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gs8sw" podUID="089a8476-5fcb-4378-8113-4a7162685b16" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.845585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.845616 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.847517 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4lw5"] Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.881323 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.890855 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:46 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:46 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:46 crc kubenswrapper[4792]: healthz check failed 
Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.890920 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.974437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.974527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btc9s\" (UniqueName: \"kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:46 crc kubenswrapper[4792]: I0318 15:37:46.974604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.077946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 
15:37:47.078027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btc9s\" (UniqueName: \"kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.078070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.078590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.078592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.089763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"] Mar 18 15:37:47 crc kubenswrapper[4792]: W0318 15:37:47.098467 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fb83a1_1a00_45a4_b62c_77e9b149e6d4.slice/crio-48fd539f9df780d7607de2a425685987a2f8ef5c5538793b0b9b8ee09b8f4923 WatchSource:0}: Error finding container 48fd539f9df780d7607de2a425685987a2f8ef5c5538793b0b9b8ee09b8f4923: Status 404 returned error can't find the container with id 48fd539f9df780d7607de2a425685987a2f8ef5c5538793b0b9b8ee09b8f4923 Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.102210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btc9s\" (UniqueName: \"kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s\") pod \"redhat-marketplace-v4lw5\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.167736 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.177831 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.178645 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.184001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.186355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.187635 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.282539 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.282586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.286942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"] Mar 18 15:37:47 crc kubenswrapper[4792]: W0318 15:37:47.317137 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d7e9f8_1f62_49fb_a8fc_cc6cd8be7fb6.slice/crio-f87bdaee6cba6ed543cc3a8e9899d321b86161eca0e573e2a33082a4ea32fa2f WatchSource:0}: Error finding 
container f87bdaee6cba6ed543cc3a8e9899d321b86161eca0e573e2a33082a4ea32fa2f: Status 404 returned error can't find the container with id f87bdaee6cba6ed543cc3a8e9899d321b86161eca0e573e2a33082a4ea32fa2f Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.383406 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.383455 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.383617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.402796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.609737 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4lw5"] Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 
15:37:47.620881 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.622012 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.625380 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:37:47 crc kubenswrapper[4792]: W0318 15:37:47.633997 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9190b8d3_2c0b_4345_9390_21511bad8701.slice/crio-395ec634a6b183205d145dbf7ef68691b994f10fd88a9e55ea28d56282ba6ef2 WatchSource:0}: Error finding container 395ec634a6b183205d145dbf7ef68691b994f10fd88a9e55ea28d56282ba6ef2: Status 404 returned error can't find the container with id 395ec634a6b183205d145dbf7ef68691b994f10fd88a9e55ea28d56282ba6ef2 Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.637242 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.642334 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.697585 4792 generic.go:334] "Generic (PLEG): container finished" podID="8f4ace7c-3804-417a-bbdb-787805432273" containerID="30d77ff3640e82e375827a585ff255aafb76dee73db80923fff92e146f26da27" exitCode=0 Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.697644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f4ace7c-3804-417a-bbdb-787805432273","Type":"ContainerDied","Data":"30d77ff3640e82e375827a585ff255aafb76dee73db80923fff92e146f26da27"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.702920 4792 generic.go:334] "Generic (PLEG): container finished" podID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerID="8564824b6f02c689be7f97b26755e6d3a066da0d45527d997b0477c11d5459cf" exitCode=0 Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.702989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerDied","Data":"8564824b6f02c689be7f97b26755e6d3a066da0d45527d997b0477c11d5459cf"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.703014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerStarted","Data":"f87bdaee6cba6ed543cc3a8e9899d321b86161eca0e573e2a33082a4ea32fa2f"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.706037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" event={"ID":"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4","Type":"ContainerStarted","Data":"b21a98727dd5a977cb18eedd43ed6a442e9bd6764a6bea3a5d43e53f936ee882"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.706064 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" event={"ID":"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4","Type":"ContainerStarted","Data":"48fd539f9df780d7607de2a425685987a2f8ef5c5538793b0b9b8ee09b8f4923"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.708994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.711789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerStarted","Data":"395ec634a6b183205d145dbf7ef68691b994f10fd88a9e55ea28d56282ba6ef2"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.734356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" event={"ID":"6b93393b-f935-45f2-9ad1-8a119230b1fa","Type":"ContainerStarted","Data":"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.734417 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" event={"ID":"6b93393b-f935-45f2-9ad1-8a119230b1fa","Type":"ContainerStarted","Data":"4f623c90af1277cb10ffa13da58e82d92808172f38af31c78b4680848fe32f03"} Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.735097 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.737026 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.762324 4792 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" podStartSLOduration=7.762309299 podStartE2EDuration="7.762309299s" podCreationTimestamp="2026-03-18 15:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:47.743414575 +0000 UTC m=+216.612743512" watchObservedRunningTime="2026-03-18 15:37:47.762309299 +0000 UTC m=+216.631638246" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.787735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.787798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4r5l\" (UniqueName: \"kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.787850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.813823 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" podStartSLOduration=168.813807777 podStartE2EDuration="2m48.813807777s" 
podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:37:47.81277693 +0000 UTC m=+216.682105867" watchObservedRunningTime="2026-03-18 15:37:47.813807777 +0000 UTC m=+216.683136704" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.820565 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.820818 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.834229 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.843464 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.843506 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.843836 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.843852 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.864061 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.890032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.890119 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.890297 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4r5l\" (UniqueName: \"kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.896069 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content\") pod \"redhat-operators-c58jv\" (UID: 
\"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.894854 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.898732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.926984 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:47 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:47 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:47 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.927054 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.936690 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4r5l\" (UniqueName: \"kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l\") pod \"redhat-operators-c58jv\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:47 crc kubenswrapper[4792]: I0318 15:37:47.954929 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.029725 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.031472 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.040234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.174422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.193662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9ph\" (UniqueName: \"kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.193739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.193769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " 
pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.295005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.295061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.295375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9ph\" (UniqueName: \"kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.296618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.296909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc 
kubenswrapper[4792]: I0318 15:37:48.341774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9ph\" (UniqueName: \"kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph\") pod \"redhat-operators-fd6pb\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.345031 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.561471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:37:48 crc kubenswrapper[4792]: W0318 15:37:48.610531 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be7d1e5_fbbe_495a_a2e1_0beb64dacbe2.slice/crio-b329a552ef2528a52184cce25fa713b520bc6cf09fdf1e30e2d988cf39536ad8 WatchSource:0}: Error finding container b329a552ef2528a52184cce25fa713b520bc6cf09fdf1e30e2d988cf39536ad8: Status 404 returned error can't find the container with id b329a552ef2528a52184cce25fa713b520bc6cf09fdf1e30e2d988cf39536ad8 Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.706748 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.759040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6b84089-fdea-4d7d-a634-56631055fb4f","Type":"ContainerStarted","Data":"00f2ff20f28516274a0497263f58971a8118e79b19698874a6bce434a3c4296c"} Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.764445 4792 generic.go:334] "Generic (PLEG): container finished" podID="9190b8d3-2c0b-4345-9390-21511bad8701" 
containerID="44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7" exitCode=0 Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.765288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerDied","Data":"44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7"} Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.771637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerStarted","Data":"b329a552ef2528a52184cce25fa713b520bc6cf09fdf1e30e2d988cf39536ad8"} Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.784259 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-dkkhx" Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.898587 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:48 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:48 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:48 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:48 crc kubenswrapper[4792]: I0318 15:37:48.898650 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.088290 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.215109 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access\") pod \"8f4ace7c-3804-417a-bbdb-787805432273\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.216850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir\") pod \"8f4ace7c-3804-417a-bbdb-787805432273\" (UID: \"8f4ace7c-3804-417a-bbdb-787805432273\") " Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.216983 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f4ace7c-3804-417a-bbdb-787805432273" (UID: "8f4ace7c-3804-417a-bbdb-787805432273"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.217429 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f4ace7c-3804-417a-bbdb-787805432273-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.221983 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f4ace7c-3804-417a-bbdb-787805432273" (UID: "8f4ace7c-3804-417a-bbdb-787805432273"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.318573 4792 ???:1] "http: TLS handshake error from 192.168.126.11:58000: no serving certificate available for the kubelet" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.320034 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f4ace7c-3804-417a-bbdb-787805432273-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.782748 4792 generic.go:334] "Generic (PLEG): container finished" podID="d6b84089-fdea-4d7d-a634-56631055fb4f" containerID="178c6aadad2086324f627707489e19bed35fef5486152c91c4d099867c2a843c" exitCode=0 Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.782807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6b84089-fdea-4d7d-a634-56631055fb4f","Type":"ContainerDied","Data":"178c6aadad2086324f627707489e19bed35fef5486152c91c4d099867c2a843c"} Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.801863 4792 generic.go:334] "Generic (PLEG): container finished" podID="de3a8b86-9557-46d6-b146-bc402161292c" containerID="a82e6428fb597054018b209982772e46e6797d95ac347940042d3068045e522d" exitCode=0 Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.802437 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerDied","Data":"a82e6428fb597054018b209982772e46e6797d95ac347940042d3068045e522d"} Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.802486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerStarted","Data":"81bd1978217df14fda7d9d4d6b03dc067411d72b342d86101e9cee5b73751d52"} Mar 18 
15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.806833 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.807452 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f4ace7c-3804-417a-bbdb-787805432273","Type":"ContainerDied","Data":"996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588"} Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.807530 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="996feadc51934be288eda05bac7b0d9d4dcf598083a8e2376f1d1fae2f613588" Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.833160 4792 generic.go:334] "Generic (PLEG): container finished" podID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerID="e4a14016c61c520d1ec65b83d319856831e055f3fcae0fd8c808a59198ee0ebf" exitCode=0 Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.833489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerDied","Data":"e4a14016c61c520d1ec65b83d319856831e055f3fcae0fd8c808a59198ee0ebf"} Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.892220 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 15:37:49 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 18 15:37:49 crc kubenswrapper[4792]: [+]process-running ok Mar 18 15:37:49 crc kubenswrapper[4792]: healthz check failed Mar 18 15:37:49 crc kubenswrapper[4792]: I0318 15:37:49.892563 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tfth7" 
podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.269572 4792 ???:1] "http: TLS handshake error from 192.168.126.11:58014: no serving certificate available for the kubelet" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.345392 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.351710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f6d7b0a3-b8fe-49f9-91ad-ae46796becbc-metrics-certs\") pod \"network-metrics-daemon-rpvb6\" (UID: \"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc\") " pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.467512 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpvb6" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.846749 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c77956c-88bf-4e94-a8de-a41728753ccd" containerID="1bca90bfe8f505b35b43705bca61006e30650fd20a2a8fc4daa0d3a4946760d4" exitCode=0 Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.846912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" event={"ID":"2c77956c-88bf-4e94-a8de-a41728753ccd","Type":"ContainerDied","Data":"1bca90bfe8f505b35b43705bca61006e30650fd20a2a8fc4daa0d3a4946760d4"} Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.889884 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:50 crc kubenswrapper[4792]: I0318 15:37:50.894859 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tfth7" Mar 18 15:37:53 crc kubenswrapper[4792]: I0318 15:37:53.105819 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h4rq9" Mar 18 15:37:56 crc kubenswrapper[4792]: I0318 15:37:56.866185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:56 crc kubenswrapper[4792]: I0318 15:37:56.874074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.366836 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.374121 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.487992 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87szr\" (UniqueName: \"kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr\") pod \"2c77956c-88bf-4e94-a8de-a41728753ccd\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488104 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume\") pod \"2c77956c-88bf-4e94-a8de-a41728753ccd\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume\") pod \"2c77956c-88bf-4e94-a8de-a41728753ccd\" (UID: \"2c77956c-88bf-4e94-a8de-a41728753ccd\") " Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access\") pod \"d6b84089-fdea-4d7d-a634-56631055fb4f\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488289 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir\") pod \"d6b84089-fdea-4d7d-a634-56631055fb4f\" (UID: \"d6b84089-fdea-4d7d-a634-56631055fb4f\") " Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d6b84089-fdea-4d7d-a634-56631055fb4f" (UID: "d6b84089-fdea-4d7d-a634-56631055fb4f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.488694 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6b84089-fdea-4d7d-a634-56631055fb4f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.489056 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c77956c-88bf-4e94-a8de-a41728753ccd" (UID: "2c77956c-88bf-4e94-a8de-a41728753ccd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.494201 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr" (OuterVolumeSpecName: "kube-api-access-87szr") pod "2c77956c-88bf-4e94-a8de-a41728753ccd" (UID: "2c77956c-88bf-4e94-a8de-a41728753ccd"). InnerVolumeSpecName "kube-api-access-87szr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.495174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d6b84089-fdea-4d7d-a634-56631055fb4f" (UID: "d6b84089-fdea-4d7d-a634-56631055fb4f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.495765 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c77956c-88bf-4e94-a8de-a41728753ccd" (UID: "2c77956c-88bf-4e94-a8de-a41728753ccd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.589755 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c77956c-88bf-4e94-a8de-a41728753ccd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.589807 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c77956c-88bf-4e94-a8de-a41728753ccd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.589820 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6b84089-fdea-4d7d-a634-56631055fb4f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.589833 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87szr\" (UniqueName: \"kubernetes.io/projected/2c77956c-88bf-4e94-a8de-a41728753ccd-kube-api-access-87szr\") on node \"crc\" DevicePath \"\"" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.847668 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ltvvk" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.913172 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.913179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v" event={"ID":"2c77956c-88bf-4e94-a8de-a41728753ccd","Type":"ContainerDied","Data":"e03f96735acf3f2c8acdacc34f5cb2057cb2de19b5f125bea3dd53b1e6feaa39"} Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.913874 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e03f96735acf3f2c8acdacc34f5cb2057cb2de19b5f125bea3dd53b1e6feaa39" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.916494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6b84089-fdea-4d7d-a634-56631055fb4f","Type":"ContainerDied","Data":"00f2ff20f28516274a0497263f58971a8118e79b19698874a6bce434a3c4296c"} Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.916529 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f2ff20f28516274a0497263f58971a8118e79b19698874a6bce434a3c4296c" Mar 18 15:37:57 crc kubenswrapper[4792]: I0318 15:37:57.916535 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 15:37:59 crc kubenswrapper[4792]: I0318 15:37:59.585755 4792 ???:1] "http: TLS handshake error from 192.168.126.11:57044: no serving certificate available for the kubelet" Mar 18 15:37:59 crc kubenswrapper[4792]: I0318 15:37:59.850319 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:37:59 crc kubenswrapper[4792]: I0318 15:37:59.850516 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerName="controller-manager" containerID="cri-o://138abdbf150a14e65c626425f52c8c8106b4c24e242ac5b6bd08f92a0d0f24af" gracePeriod=30 Mar 18 15:37:59 crc kubenswrapper[4792]: I0318 15:37:59.877838 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"] Mar 18 15:37:59 crc kubenswrapper[4792]: I0318 15:37:59.878053 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerName="route-controller-manager" containerID="cri-o://b21a98727dd5a977cb18eedd43ed6a442e9bd6764a6bea3a5d43e53f936ee882" gracePeriod=30 Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.125645 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564138-fjs6m"] Mar 18 15:38:00 crc kubenswrapper[4792]: E0318 15:38:00.126565 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4ace7c-3804-417a-bbdb-787805432273" containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126579 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4ace7c-3804-417a-bbdb-787805432273" 
containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: E0318 15:38:00.126598 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c77956c-88bf-4e94-a8de-a41728753ccd" containerName="collect-profiles" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126605 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c77956c-88bf-4e94-a8de-a41728753ccd" containerName="collect-profiles" Mar 18 15:38:00 crc kubenswrapper[4792]: E0318 15:38:00.126621 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b84089-fdea-4d7d-a634-56631055fb4f" containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126627 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b84089-fdea-4d7d-a634-56631055fb4f" containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126905 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c77956c-88bf-4e94-a8de-a41728753ccd" containerName="collect-profiles" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126915 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b84089-fdea-4d7d-a634-56631055fb4f" containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.126923 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4ace7c-3804-417a-bbdb-787805432273" containerName="pruner" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.127357 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.129295 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.130321 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-fjs6m"] Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.222797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqc5z\" (UniqueName: \"kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z\") pod \"auto-csr-approver-29564138-fjs6m\" (UID: \"53ab1328-f29a-43da-9ed6-8a85635e8808\") " pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.322044 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.322118 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.324426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqc5z\" (UniqueName: \"kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z\") pod \"auto-csr-approver-29564138-fjs6m\" (UID: \"53ab1328-f29a-43da-9ed6-8a85635e8808\") " 
pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.341985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqc5z\" (UniqueName: \"kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z\") pod \"auto-csr-approver-29564138-fjs6m\" (UID: \"53ab1328-f29a-43da-9ed6-8a85635e8808\") " pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.450097 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.932540 4792 generic.go:334] "Generic (PLEG): container finished" podID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerID="138abdbf150a14e65c626425f52c8c8106b4c24e242ac5b6bd08f92a0d0f24af" exitCode=0 Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.932599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" event={"ID":"a2c12f09-78e4-4760-a527-2b90d7a10e99","Type":"ContainerDied","Data":"138abdbf150a14e65c626425f52c8c8106b4c24e242ac5b6bd08f92a0d0f24af"} Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.934024 4792 generic.go:334] "Generic (PLEG): container finished" podID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerID="b21a98727dd5a977cb18eedd43ed6a442e9bd6764a6bea3a5d43e53f936ee882" exitCode=0 Mar 18 15:38:00 crc kubenswrapper[4792]: I0318 15:38:00.934049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" event={"ID":"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4","Type":"ContainerDied","Data":"b21a98727dd5a977cb18eedd43ed6a442e9bd6764a6bea3a5d43e53f936ee882"} Mar 18 15:38:04 crc kubenswrapper[4792]: E0318 15:38:04.072295 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 15:38:04 crc kubenswrapper[4792]: E0318 15:38:04.072757 4792 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 15:38:04 crc kubenswrapper[4792]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 15:38:04 crc kubenswrapper[4792]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sct6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564136-wt69c_openshift-infra(e094f66b-fe57-429a-b5cd-de6084a8aacb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 15:38:04 crc kubenswrapper[4792]: > logger="UnhandledError" Mar 18 15:38:04 crc kubenswrapper[4792]: E0318 15:38:04.074197 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564136-wt69c" 
podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" Mar 18 15:38:04 crc kubenswrapper[4792]: I0318 15:38:04.807729 4792 patch_prober.go:28] interesting pod/controller-manager-7d68fbb675-wx2nv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:38:04 crc kubenswrapper[4792]: I0318 15:38:04.807794 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:38:04 crc kubenswrapper[4792]: E0318 15:38:04.956354 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564136-wt69c" podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" Mar 18 15:38:06 crc kubenswrapper[4792]: I0318 15:38:06.592791 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:38:07 crc kubenswrapper[4792]: I0318 15:38:07.775367 4792 patch_prober.go:28] interesting pod/route-controller-manager-6f4fcb4f7-wbpd4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 15:38:07 crc kubenswrapper[4792]: I0318 15:38:07.775541 4792 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.688579 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.697052 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.721682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:08 crc kubenswrapper[4792]: E0318 15:38:08.721891 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerName="controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.721904 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerName="controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: E0318 15:38:08.721915 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerName="route-controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.721922 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerName="route-controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.722039 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" containerName="route-controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.722051 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" containerName="controller-manager" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.722389 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.731080 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca\") pod \"a2c12f09-78e4-4760-a527-2b90d7a10e99\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config\") pod \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834370 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca\") pod \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config\") pod 
\"a2c12f09-78e4-4760-a527-2b90d7a10e99\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6kzk\" (UniqueName: \"kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk\") pod \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4n4g\" (UniqueName: \"kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g\") pod \"a2c12f09-78e4-4760-a527-2b90d7a10e99\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834558 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert\") pod \"a2c12f09-78e4-4760-a527-2b90d7a10e99\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834585 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles\") pod \"a2c12f09-78e4-4760-a527-2b90d7a10e99\" (UID: \"a2c12f09-78e4-4760-a527-2b90d7a10e99\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834604 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert\") pod \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\" (UID: \"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4\") " Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834769 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834827 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqp9\" (UniqueName: \"kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.834866 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.835352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config" (OuterVolumeSpecName: "config") pod 
"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" (UID: "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.835384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" (UID: "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.835406 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2c12f09-78e4-4760-a527-2b90d7a10e99" (UID: "a2c12f09-78e4-4760-a527-2b90d7a10e99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.835696 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a2c12f09-78e4-4760-a527-2b90d7a10e99" (UID: "a2c12f09-78e4-4760-a527-2b90d7a10e99"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.836353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config" (OuterVolumeSpecName: "config") pod "a2c12f09-78e4-4760-a527-2b90d7a10e99" (UID: "a2c12f09-78e4-4760-a527-2b90d7a10e99"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.840516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" (UID: "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.840511 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk" (OuterVolumeSpecName: "kube-api-access-h6kzk") pod "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" (UID: "b8fb83a1-1a00-45a4-b62c-77e9b149e6d4"). InnerVolumeSpecName "kube-api-access-h6kzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.840679 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2c12f09-78e4-4760-a527-2b90d7a10e99" (UID: "a2c12f09-78e4-4760-a527-2b90d7a10e99"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.840829 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g" (OuterVolumeSpecName: "kube-api-access-t4n4g") pod "a2c12f09-78e4-4760-a527-2b90d7a10e99" (UID: "a2c12f09-78e4-4760-a527-2b90d7a10e99"). InnerVolumeSpecName "kube-api-access-t4n4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936469 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqp9\" (UniqueName: \"kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936524 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936575 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936622 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6kzk\" (UniqueName: 
\"kubernetes.io/projected/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-kube-api-access-h6kzk\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936635 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4n4g\" (UniqueName: \"kubernetes.io/projected/a2c12f09-78e4-4760-a527-2b90d7a10e99-kube-api-access-t4n4g\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936649 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c12f09-78e4-4760-a527-2b90d7a10e99-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936661 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936671 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936682 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936692 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.936702 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc 
kubenswrapper[4792]: I0318 15:38:08.936713 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2c12f09-78e4-4760-a527-2b90d7a10e99-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.937903 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.938222 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.941070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.956698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqp9\" (UniqueName: \"kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9\") pod \"route-controller-manager-675465784d-k5nwc\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 
15:38:08.981091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" event={"ID":"b8fb83a1-1a00-45a4-b62c-77e9b149e6d4","Type":"ContainerDied","Data":"48fd539f9df780d7607de2a425685987a2f8ef5c5538793b0b9b8ee09b8f4923"} Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.981168 4792 scope.go:117] "RemoveContainer" containerID="b21a98727dd5a977cb18eedd43ed6a442e9bd6764a6bea3a5d43e53f936ee882" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.981107 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4" Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.983233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" event={"ID":"a2c12f09-78e4-4760-a527-2b90d7a10e99","Type":"ContainerDied","Data":"b409e00f06ea2019c33dfccebbacd46a405968788afba5ab1795a81243161af3"} Mar 18 15:38:08 crc kubenswrapper[4792]: I0318 15:38:08.983295 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d68fbb675-wx2nv" Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.029215 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.037556 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d68fbb675-wx2nv"] Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.039572 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.042876 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"] Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.046820 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f4fcb4f7-wbpd4"] Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.862770 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c12f09-78e4-4760-a527-2b90d7a10e99" path="/var/lib/kubelet/pods/a2c12f09-78e4-4760-a527-2b90d7a10e99/volumes" Mar 18 15:38:09 crc kubenswrapper[4792]: I0318 15:38:09.864253 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fb83a1-1a00-45a4-b62c-77e9b149e6d4" path="/var/lib/kubelet/pods/b8fb83a1-1a00-45a4-b62c-77e9b149e6d4/volumes" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.442426 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.443295 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.445588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.446322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.446825 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.446961 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.447111 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.447247 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.456821 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.458434 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.577486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " 
pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.577582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.577617 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.577650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8zj\" (UniqueName: \"kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.577691 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.678907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.679030 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.679106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.679142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.679172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8zj\" (UniqueName: \"kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.680042 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.680229 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.682540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.687079 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc kubenswrapper[4792]: I0318 15:38:11.698679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8zj\" (UniqueName: \"kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj\") pod \"controller-manager-c4b5bb5ff-2lrqq\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:11 crc 
kubenswrapper[4792]: I0318 15:38:11.762898 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:13 crc kubenswrapper[4792]: E0318 15:38:13.531076 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 15:38:13 crc kubenswrapper[4792]: E0318 15:38:13.531837 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpmfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},T
erminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jqrw8_openshift-marketplace(8ba56c2f-0c1b-4201-9960-590b2cb73fb6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:13 crc kubenswrapper[4792]: E0318 15:38:13.533096 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jqrw8" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.125684 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 15:38:18 crc kubenswrapper[4792]: E0318 15:38:18.126523 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jqrw8" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.299227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" Mar 18 15:38:18 crc kubenswrapper[4792]: E0318 15:38:18.299530 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 15:38:18 crc kubenswrapper[4792]: E0318 15:38:18.299746 4792 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9vck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nztp5_openshift-marketplace(50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:18 crc kubenswrapper[4792]: E0318 15:38:18.300984 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nztp5" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.621473 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.622206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.626416 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.626691 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.627873 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.768723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.768822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 
15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.869572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.869634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.869755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.900508 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:18 crc kubenswrapper[4792]: I0318 15:38:18.949506 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:19 crc kubenswrapper[4792]: I0318 15:38:19.831166 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:19 crc kubenswrapper[4792]: I0318 15:38:19.927462 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.527453 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nztp5" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.637163 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.637562 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7c8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-j66mt_openshift-marketplace(b891667e-d9ed-4602-8ef2-b0461e32a955): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.638785 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-j66mt" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" Mar 18 15:38:21 crc 
kubenswrapper[4792]: E0318 15:38:21.642115 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.642268 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btc9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-v4lw5_openshift-marketplace(9190b8d3-2c0b-4345-9390-21511bad8701): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.643626 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v4lw5" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.662455 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.662640 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76n4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-b4ws8_openshift-marketplace(0bd261d3-47ac-4082-bc3b-128b9b72df06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.663915 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b4ws8" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" Mar 18 15:38:21 crc 
kubenswrapper[4792]: E0318 15:38:21.677232 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.677406 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ssjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-ps7zj_openshift-marketplace(2ab5703e-d7a2-4839-b0d5-2851a53b5f6f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:21 crc kubenswrapper[4792]: E0318 15:38:21.678901 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ps7zj" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" Mar 18 15:38:21 crc kubenswrapper[4792]: I0318 15:38:21.745216 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpvb6"] Mar 18 15:38:22 crc kubenswrapper[4792]: I0318 15:38:22.997116 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:38:22 crc kubenswrapper[4792]: I0318 15:38:22.998089 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.008513 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.131085 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.131171 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.131200 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.232202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.232286 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.232312 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.232397 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.232418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.257245 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:23 crc kubenswrapper[4792]: I0318 15:38:23.337591 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.564330 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ps7zj" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.564409 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b4ws8" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.564446 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v4lw5" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.565027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-j66mt" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" Mar 18 15:38:25 crc kubenswrapper[4792]: I0318 15:38:25.593506 4792 scope.go:117] "RemoveContainer" containerID="138abdbf150a14e65c626425f52c8c8106b4c24e242ac5b6bd08f92a0d0f24af" Mar 18 15:38:25 crc kubenswrapper[4792]: W0318 15:38:25.613230 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d7b0a3_b8fe_49f9_91ad_ae46796becbc.slice/crio-fd828f0d2a2473b92e7d3ab8a31bc6b99f05071d1eb507fdc28fda56312ed5cd WatchSource:0}: Error finding container fd828f0d2a2473b92e7d3ab8a31bc6b99f05071d1eb507fdc28fda56312ed5cd: Status 404 returned error can't find the container with id fd828f0d2a2473b92e7d3ab8a31bc6b99f05071d1eb507fdc28fda56312ed5cd Mar 18 15:38:25 crc kubenswrapper[4792]: I0318 15:38:25.746428 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-fjs6m"] Mar 18 15:38:25 crc kubenswrapper[4792]: W0318 15:38:25.752494 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53ab1328_f29a_43da_9ed6_8a85635e8808.slice/crio-c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e WatchSource:0}: Error finding container c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e: Status 404 returned error can't find the container with id c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.756716 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.756841 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bx9ph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fd6pb_openshift-marketplace(de3a8b86-9557-46d6-b146-bc402161292c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:25 crc kubenswrapper[4792]: E0318 15:38:25.759164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fd6pb" podUID="de3a8b86-9557-46d6-b146-bc402161292c" Mar 18 15:38:25 crc 
kubenswrapper[4792]: I0318 15:38:25.836457 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:25 crc kubenswrapper[4792]: W0318 15:38:25.851290 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba13fbb_a1e1_45f6_b18a_3699e5eafa2a.slice/crio-004508ffd19480fd90784bb2ab5ccd98ae088489d95588f2a4300004a2c3e5dd WatchSource:0}: Error finding container 004508ffd19480fd90784bb2ab5ccd98ae088489d95588f2a4300004a2c3e5dd: Status 404 returned error can't find the container with id 004508ffd19480fd90784bb2ab5ccd98ae088489d95588f2a4300004a2c3e5dd Mar 18 15:38:25 crc kubenswrapper[4792]: I0318 15:38:25.914472 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 15:38:25 crc kubenswrapper[4792]: W0318 15:38:25.924124 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9c553e7c_a913_4865_ad91_83b3305a6789.slice/crio-2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe WatchSource:0}: Error finding container 2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe: Status 404 returned error can't find the container with id 2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe Mar 18 15:38:25 crc kubenswrapper[4792]: I0318 15:38:25.954852 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 15:38:25 crc kubenswrapper[4792]: W0318 15:38:25.959532 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb059e12_7b45_4382_a603_a79a9261d608.slice/crio-5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade WatchSource:0}: Error finding container 5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade: Status 404 returned error can't find the container with id 
5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.075051 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.086408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" event={"ID":"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc","Type":"ContainerStarted","Data":"fd828f0d2a2473b92e7d3ab8a31bc6b99f05071d1eb507fdc28fda56312ed5cd"} Mar 18 15:38:26 crc kubenswrapper[4792]: W0318 15:38:26.089809 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90099b5e_8c77_4180_b64b_50779e8e190d.slice/crio-b55cec81e660e5d5fb5b48faa52c20971b576266a084d5241ee8370214af206c WatchSource:0}: Error finding container b55cec81e660e5d5fb5b48faa52c20971b576266a084d5241ee8370214af206c: Status 404 returned error can't find the container with id b55cec81e660e5d5fb5b48faa52c20971b576266a084d5241ee8370214af206c Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.094255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" event={"ID":"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a","Type":"ContainerStarted","Data":"004508ffd19480fd90784bb2ab5ccd98ae088489d95588f2a4300004a2c3e5dd"} Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.096459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" event={"ID":"53ab1328-f29a-43da-9ed6-8a85635e8808","Type":"ContainerStarted","Data":"c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e"} Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.097502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"cb059e12-7b45-4382-a603-a79a9261d608","Type":"ContainerStarted","Data":"5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade"} Mar 18 15:38:26 crc kubenswrapper[4792]: I0318 15:38:26.098179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9c553e7c-a913-4865-ad91-83b3305a6789","Type":"ContainerStarted","Data":"2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe"} Mar 18 15:38:26 crc kubenswrapper[4792]: E0318 15:38:26.109512 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fd6pb" podUID="de3a8b86-9557-46d6-b146-bc402161292c" Mar 18 15:38:26 crc kubenswrapper[4792]: E0318 15:38:26.229504 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 15:38:26 crc kubenswrapper[4792]: E0318 15:38:26.229737 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4r5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c58jv_openshift-marketplace(6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 15:38:26 crc kubenswrapper[4792]: E0318 15:38:26.230951 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c58jv" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" Mar 18 15:38:27 crc 
kubenswrapper[4792]: I0318 15:38:27.108157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb059e12-7b45-4382-a603-a79a9261d608","Type":"ContainerStarted","Data":"3be7f9c354fac75524479c7a64522049921269f33c78f80163d66992409e7cf3"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.113727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" event={"ID":"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc","Type":"ContainerStarted","Data":"8c2cb4fe7af84a553ad0f4fd4086dec507b5242aa595d8f3ab46ddf2cc8fcb54"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.113783 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpvb6" event={"ID":"f6d7b0a3-b8fe-49f9-91ad-ae46796becbc","Type":"ContainerStarted","Data":"a1aed973a4a92ba47c9fa18f7941f2a14ab323dd38816a3b8ab92435a81125f1"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.118770 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" podUID="90099b5e-8c77-4180-b64b-50779e8e190d" containerName="route-controller-manager" containerID="cri-o://20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e" gracePeriod=30 Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.118787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" event={"ID":"90099b5e-8c77-4180-b64b-50779e8e190d","Type":"ContainerStarted","Data":"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.118898 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.118913 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" event={"ID":"90099b5e-8c77-4180-b64b-50779e8e190d","Type":"ContainerStarted","Data":"b55cec81e660e5d5fb5b48faa52c20971b576266a084d5241ee8370214af206c"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.128339 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.128319648 podStartE2EDuration="5.128319648s" podCreationTimestamp="2026-03-18 15:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.124826261 +0000 UTC m=+255.994155208" watchObservedRunningTime="2026-03-18 15:38:27.128319648 +0000 UTC m=+255.997648595" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.134874 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.137345 4792 generic.go:334] "Generic (PLEG): container finished" podID="9c553e7c-a913-4865-ad91-83b3305a6789" containerID="a85740df42cb172583a212e64ad4665fddbb78c52e032f45227e5d652baa7b07" exitCode=0 Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.137445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9c553e7c-a913-4865-ad91-83b3305a6789","Type":"ContainerDied","Data":"a85740df42cb172583a212e64ad4665fddbb78c52e032f45227e5d652baa7b07"} Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.140369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" event={"ID":"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a","Type":"ContainerStarted","Data":"26462f84fa7ecd195151792eac1fe2cad1714dd8e2454d05a5c79c089f37edf2"} Mar 18 
15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.140622 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" podUID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" containerName="controller-manager" containerID="cri-o://26462f84fa7ecd195151792eac1fe2cad1714dd8e2454d05a5c79c089f37edf2" gracePeriod=30 Mar 18 15:38:27 crc kubenswrapper[4792]: E0318 15:38:27.146343 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c58jv" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.176047 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rpvb6" podStartSLOduration=208.176024138 podStartE2EDuration="3m28.176024138s" podCreationTimestamp="2026-03-18 15:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.175151496 +0000 UTC m=+256.044480433" watchObservedRunningTime="2026-03-18 15:38:27.176024138 +0000 UTC m=+256.045353075" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.176478 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" podStartSLOduration=28.176471024 podStartE2EDuration="28.176471024s" podCreationTimestamp="2026-03-18 15:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.152799677 +0000 UTC m=+256.022128624" watchObservedRunningTime="2026-03-18 15:38:27.176471024 +0000 UTC m=+256.045799961" Mar 18 15:38:27 crc kubenswrapper[4792]: 
I0318 15:38:27.248498 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" podStartSLOduration=28.248475945 podStartE2EDuration="28.248475945s" podCreationTimestamp="2026-03-18 15:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:27.246263034 +0000 UTC m=+256.115591991" watchObservedRunningTime="2026-03-18 15:38:27.248475945 +0000 UTC m=+256.117804882" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.611577 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.645033 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:38:27 crc kubenswrapper[4792]: E0318 15:38:27.645273 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90099b5e-8c77-4180-b64b-50779e8e190d" containerName="route-controller-manager" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.645286 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90099b5e-8c77-4180-b64b-50779e8e190d" containerName="route-controller-manager" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.645399 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90099b5e-8c77-4180-b64b-50779e8e190d" containerName="route-controller-manager" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.645770 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.656234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.698270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert\") pod \"90099b5e-8c77-4180-b64b-50779e8e190d\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.698591 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca\") pod \"90099b5e-8c77-4180-b64b-50779e8e190d\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.699349 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca" (OuterVolumeSpecName: "client-ca") pod "90099b5e-8c77-4180-b64b-50779e8e190d" (UID: "90099b5e-8c77-4180-b64b-50779e8e190d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.699408 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config\") pod \"90099b5e-8c77-4180-b64b-50779e8e190d\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.699579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glqp9\" (UniqueName: \"kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9\") pod \"90099b5e-8c77-4180-b64b-50779e8e190d\" (UID: \"90099b5e-8c77-4180-b64b-50779e8e190d\") " Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.700156 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.700577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config" (OuterVolumeSpecName: "config") pod "90099b5e-8c77-4180-b64b-50779e8e190d" (UID: "90099b5e-8c77-4180-b64b-50779e8e190d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.704030 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9" (OuterVolumeSpecName: "kube-api-access-glqp9") pod "90099b5e-8c77-4180-b64b-50779e8e190d" (UID: "90099b5e-8c77-4180-b64b-50779e8e190d"). InnerVolumeSpecName "kube-api-access-glqp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.712696 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "90099b5e-8c77-4180-b64b-50779e8e190d" (UID: "90099b5e-8c77-4180-b64b-50779e8e190d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.766242 4792 csr.go:261] certificate signing request csr-fdc5r is approved, waiting to be issued Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.770822 4792 csr.go:257] certificate signing request csr-fdc5r is issued Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.801633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.801807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.801927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84rjk\" (UniqueName: \"kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: 
\"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.802082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.802200 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90099b5e-8c77-4180-b64b-50779e8e190d-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.802280 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glqp9\" (UniqueName: \"kubernetes.io/projected/90099b5e-8c77-4180-b64b-50779e8e190d-kube-api-access-glqp9\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.802360 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90099b5e-8c77-4180-b64b-50779e8e190d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.903884 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.904260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.904285 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84rjk\" (UniqueName: \"kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.904323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.905790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.906479 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc 
kubenswrapper[4792]: I0318 15:38:27.911795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.924628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84rjk\" (UniqueName: \"kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk\") pod \"route-controller-manager-8844c8d8f-hf8mr\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:27 crc kubenswrapper[4792]: I0318 15:38:27.973443 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.149897 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" containerID="26462f84fa7ecd195151792eac1fe2cad1714dd8e2454d05a5c79c089f37edf2" exitCode=0 Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.150009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" event={"ID":"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a","Type":"ContainerDied","Data":"26462f84fa7ecd195151792eac1fe2cad1714dd8e2454d05a5c79c089f37edf2"} Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.153480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564136-wt69c" event={"ID":"e094f66b-fe57-429a-b5cd-de6084a8aacb","Type":"ContainerStarted","Data":"ac0224c69f699d2cbdf0f640ce079bf0e89f74eaa87c15e9a094a39721d510d9"} Mar 18 
15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.156385 4792 generic.go:334] "Generic (PLEG): container finished" podID="90099b5e-8c77-4180-b64b-50779e8e190d" containerID="20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e" exitCode=0 Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.157107 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.157658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" event={"ID":"90099b5e-8c77-4180-b64b-50779e8e190d","Type":"ContainerDied","Data":"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e"} Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.157784 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc" event={"ID":"90099b5e-8c77-4180-b64b-50779e8e190d","Type":"ContainerDied","Data":"b55cec81e660e5d5fb5b48faa52c20971b576266a084d5241ee8370214af206c"} Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.157909 4792 scope.go:117] "RemoveContainer" containerID="20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.184172 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564136-wt69c" podStartSLOduration=102.027308068 podStartE2EDuration="2m28.184150159s" podCreationTimestamp="2026-03-18 15:36:00 +0000 UTC" firstStartedPulling="2026-03-18 15:37:40.77656388 +0000 UTC m=+209.645892807" lastFinishedPulling="2026-03-18 15:38:26.93340596 +0000 UTC m=+255.802734898" observedRunningTime="2026-03-18 15:38:28.177207585 +0000 UTC m=+257.046536542" watchObservedRunningTime="2026-03-18 15:38:28.184150159 +0000 UTC m=+257.053479096" Mar 18 
15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.197374 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.201061 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675465784d-k5nwc"] Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.444862 4792 scope.go:117] "RemoveContainer" containerID="20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e" Mar 18 15:38:28 crc kubenswrapper[4792]: E0318 15:38:28.445526 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e\": container with ID starting with 20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e not found: ID does not exist" containerID="20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.445558 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e"} err="failed to get container status \"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e\": rpc error: code = NotFound desc = could not find container \"20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e\": container with ID starting with 20434577f61fd37ccc32fe9e36dd14e34ac764e207576163b5da2866cc97411e not found: ID does not exist" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.469835 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.600022 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.629703 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir\") pod \"9c553e7c-a913-4865-ad91-83b3305a6789\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.629776 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access\") pod \"9c553e7c-a913-4865-ad91-83b3305a6789\" (UID: \"9c553e7c-a913-4865-ad91-83b3305a6789\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.629806 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c553e7c-a913-4865-ad91-83b3305a6789" (UID: "9c553e7c-a913-4865-ad91-83b3305a6789"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.630076 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c553e7c-a913-4865-ad91-83b3305a6789-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.635805 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c553e7c-a913-4865-ad91-83b3305a6789" (UID: "9c553e7c-a913-4865-ad91-83b3305a6789"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.730710 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert\") pod \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.731207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f8zj\" (UniqueName: \"kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj\") pod \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.731288 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config\") pod \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.731352 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles\") pod \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.731391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca\") pod \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\" (UID: \"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a\") " Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.731671 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9c553e7c-a913-4865-ad91-83b3305a6789-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.733765 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" (UID: "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.733839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config" (OuterVolumeSpecName: "config") pod "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" (UID: "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.734170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" (UID: "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.736047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" (UID: "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.741228 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj" (OuterVolumeSpecName: "kube-api-access-2f8zj") pod "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" (UID: "7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a"). InnerVolumeSpecName "kube-api-access-2f8zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.771979 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 20:53:01.128413834 +0000 UTC Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.772023 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6365h14m32.356393586s for next certificate rotation Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.805405 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:38:28 crc kubenswrapper[4792]: W0318 15:38:28.820589 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e0560a_ec5f_4c71_bec8_c69aa8321608.slice/crio-c69548edbbe31d1b4829bc695d1b6cc292c8141b666fc796e80a44428509d221 WatchSource:0}: Error finding container c69548edbbe31d1b4829bc695d1b6cc292c8141b666fc796e80a44428509d221: Status 404 returned error can't find the container with id c69548edbbe31d1b4829bc695d1b6cc292c8141b666fc796e80a44428509d221 Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.832447 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc 
kubenswrapper[4792]: I0318 15:38:28.832474 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.832484 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.832493 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f8zj\" (UniqueName: \"kubernetes.io/projected/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-kube-api-access-2f8zj\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:28 crc kubenswrapper[4792]: I0318 15:38:28.832501 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.163279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" event={"ID":"f7e0560a-ec5f-4c71-bec8-c69aa8321608","Type":"ContainerStarted","Data":"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.163537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" event={"ID":"f7e0560a-ec5f-4c71-bec8-c69aa8321608","Type":"ContainerStarted","Data":"c69548edbbe31d1b4829bc695d1b6cc292c8141b666fc796e80a44428509d221"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.164621 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 
15:38:29.165932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9c553e7c-a913-4865-ad91-83b3305a6789","Type":"ContainerDied","Data":"2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.165956 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc0f3194082864f8de2e8013b19ba091f569dad44e8c01409697523e982eefe" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.166004 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.171590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" event={"ID":"7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a","Type":"ContainerDied","Data":"004508ffd19480fd90784bb2ab5ccd98ae088489d95588f2a4300004a2c3e5dd"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.171644 4792 scope.go:117] "RemoveContainer" containerID="26462f84fa7ecd195151792eac1fe2cad1714dd8e2454d05a5c79c089f37edf2" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.171734 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.180764 4792 generic.go:334] "Generic (PLEG): container finished" podID="e094f66b-fe57-429a-b5cd-de6084a8aacb" containerID="ac0224c69f699d2cbdf0f640ce079bf0e89f74eaa87c15e9a094a39721d510d9" exitCode=0 Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.180849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564136-wt69c" event={"ID":"e094f66b-fe57-429a-b5cd-de6084a8aacb","Type":"ContainerDied","Data":"ac0224c69f699d2cbdf0f640ce079bf0e89f74eaa87c15e9a094a39721d510d9"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.183934 4792 generic.go:334] "Generic (PLEG): container finished" podID="53ab1328-f29a-43da-9ed6-8a85635e8808" containerID="fe197a4474095149db460a6ac9ce05e600a95eca3b713cc6e9e79f5c8c379abd" exitCode=0 Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.184287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" event={"ID":"53ab1328-f29a-43da-9ed6-8a85635e8808","Type":"ContainerDied","Data":"fe197a4474095149db460a6ac9ce05e600a95eca3b713cc6e9e79f5c8c379abd"} Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.202750 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" podStartSLOduration=10.202727943 podStartE2EDuration="10.202727943s" podCreationTimestamp="2026-03-18 15:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:29.196873208 +0000 UTC m=+258.066202165" watchObservedRunningTime="2026-03-18 15:38:29.202727943 +0000 UTC m=+258.072056890" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.213790 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.213846 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c4b5bb5ff-2lrqq"] Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.254282 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.772284 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 03:37:13.40357851 +0000 UTC Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.772342 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6827h58m43.631241707s for next certificate rotation Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.863840 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" path="/var/lib/kubelet/pods/7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a/volumes" Mar 18 15:38:29 crc kubenswrapper[4792]: I0318 15:38:29.864707 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90099b5e-8c77-4180-b64b-50779e8e190d" path="/var/lib/kubelet/pods/90099b5e-8c77-4180-b64b-50779e8e190d/volumes" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.321851 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.322272 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.322339 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.323262 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.323345 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8" gracePeriod=600 Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.418184 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.423283 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.466996 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:38:30 crc kubenswrapper[4792]: E0318 15:38:30.467251 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" containerName="controller-manager" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467271 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" containerName="controller-manager" Mar 18 15:38:30 crc kubenswrapper[4792]: E0318 15:38:30.467282 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c553e7c-a913-4865-ad91-83b3305a6789" containerName="pruner" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467290 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c553e7c-a913-4865-ad91-83b3305a6789" containerName="pruner" Mar 18 15:38:30 crc kubenswrapper[4792]: E0318 15:38:30.467307 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467315 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: E0318 15:38:30.467328 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ab1328-f29a-43da-9ed6-8a85635e8808" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467335 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ab1328-f29a-43da-9ed6-8a85635e8808" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467459 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c553e7c-a913-4865-ad91-83b3305a6789" containerName="pruner" Mar 18 15:38:30 
crc kubenswrapper[4792]: I0318 15:38:30.467475 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba13fbb-a1e1-45f6-b18a-3699e5eafa2a" containerName="controller-manager" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467486 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ab1328-f29a-43da-9ed6-8a85635e8808" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467500 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" containerName="oc" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.467999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.470382 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.471374 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.471716 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.471857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.472415 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.472580 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.476867 4792 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.486806 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.556864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sct6n\" (UniqueName: \"kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n\") pod \"e094f66b-fe57-429a-b5cd-de6084a8aacb\" (UID: \"e094f66b-fe57-429a-b5cd-de6084a8aacb\") " Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557210 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqc5z\" (UniqueName: \"kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z\") pod \"53ab1328-f29a-43da-9ed6-8a85635e8808\" (UID: \"53ab1328-f29a-43da-9ed6-8a85635e8808\") " Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557478 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw89p\" (UniqueName: \"kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557533 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.557662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.562070 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n" (OuterVolumeSpecName: "kube-api-access-sct6n") pod "e094f66b-fe57-429a-b5cd-de6084a8aacb" (UID: "e094f66b-fe57-429a-b5cd-de6084a8aacb"). InnerVolumeSpecName "kube-api-access-sct6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.562113 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z" (OuterVolumeSpecName: "kube-api-access-sqc5z") pod "53ab1328-f29a-43da-9ed6-8a85635e8808" (UID: "53ab1328-f29a-43da-9ed6-8a85635e8808"). InnerVolumeSpecName "kube-api-access-sqc5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw89p\" (UniqueName: \"kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658604 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658720 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sct6n\" (UniqueName: \"kubernetes.io/projected/e094f66b-fe57-429a-b5cd-de6084a8aacb-kube-api-access-sct6n\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.658736 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqc5z\" (UniqueName: \"kubernetes.io/projected/53ab1328-f29a-43da-9ed6-8a85635e8808-kube-api-access-sqc5z\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.660043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.660059 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " 
pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.660908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.663575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.675595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw89p\" (UniqueName: \"kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p\") pod \"controller-manager-6bf4c887f-mdlxm\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:30 crc kubenswrapper[4792]: I0318 15:38:30.809278 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.203656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564136-wt69c" event={"ID":"e094f66b-fe57-429a-b5cd-de6084a8aacb","Type":"ContainerDied","Data":"3801e39f6838dd3a2128ee9d248b3994b469121df89f4c0b44757184d8fd1dd8"} Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.204192 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3801e39f6838dd3a2128ee9d248b3994b469121df89f4c0b44757184d8fd1dd8" Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.204279 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564136-wt69c" Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.210649 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.211126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564138-fjs6m" event={"ID":"53ab1328-f29a-43da-9ed6-8a85635e8808","Type":"ContainerDied","Data":"c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e"} Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.211178 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4adae7d1036ec94518141a80262f0ccc568fec13bf253a33411d3587447185e" Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.222741 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8" exitCode=0 Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.222904 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8"} Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.223067 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:38:31 crc kubenswrapper[4792]: I0318 15:38:31.223103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348"} Mar 18 15:38:31 crc kubenswrapper[4792]: W0318 15:38:31.232130 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4b82531_c864_4b5c_9a5f_59170ddb5c7c.slice/crio-4333338f59a440c93b07e54a51c581b2c8810312b0e8486d76498319b62cfcfc WatchSource:0}: Error finding container 4333338f59a440c93b07e54a51c581b2c8810312b0e8486d76498319b62cfcfc: Status 404 returned error can't find the container with id 4333338f59a440c93b07e54a51c581b2c8810312b0e8486d76498319b62cfcfc Mar 18 15:38:32 crc kubenswrapper[4792]: I0318 15:38:32.229197 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" event={"ID":"d4b82531-c864-4b5c-9a5f-59170ddb5c7c","Type":"ContainerStarted","Data":"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60"} Mar 18 15:38:32 crc kubenswrapper[4792]: I0318 15:38:32.229482 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:32 crc kubenswrapper[4792]: I0318 15:38:32.229518 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" event={"ID":"d4b82531-c864-4b5c-9a5f-59170ddb5c7c","Type":"ContainerStarted","Data":"4333338f59a440c93b07e54a51c581b2c8810312b0e8486d76498319b62cfcfc"} Mar 18 15:38:32 crc kubenswrapper[4792]: I0318 15:38:32.233912 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:38:32 crc kubenswrapper[4792]: I0318 15:38:32.248036 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" podStartSLOduration=13.247997713 podStartE2EDuration="13.247997713s" podCreationTimestamp="2026-03-18 15:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:38:32.244318658 +0000 UTC m=+261.113647585" watchObservedRunningTime="2026-03-18 15:38:32.247997713 +0000 UTC m=+261.117326650" Mar 18 15:38:34 crc kubenswrapper[4792]: I0318 15:38:34.242010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerStarted","Data":"ac70cc95e56f07cc16beadff86603fdcfa363a0ac63315a4468c3dbdafad2941"} Mar 18 15:38:35 crc kubenswrapper[4792]: I0318 15:38:35.248586 4792 generic.go:334] "Generic (PLEG): container finished" podID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerID="ac70cc95e56f07cc16beadff86603fdcfa363a0ac63315a4468c3dbdafad2941" exitCode=0 Mar 18 15:38:35 crc kubenswrapper[4792]: I0318 15:38:35.248659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerDied","Data":"ac70cc95e56f07cc16beadff86603fdcfa363a0ac63315a4468c3dbdafad2941"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.314147 4792 
generic.go:334] "Generic (PLEG): container finished" podID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerID="54bd7a9fc5f42cc9ed4bf62a38db1724e9e652f776fac91e8691d17021b70917" exitCode=0 Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.314213 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerDied","Data":"54bd7a9fc5f42cc9ed4bf62a38db1724e9e652f776fac91e8691d17021b70917"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.318024 4792 generic.go:334] "Generic (PLEG): container finished" podID="de3a8b86-9557-46d6-b146-bc402161292c" containerID="70b4c191d45098c97322d0ce6969cd1a2a5e887c353bff69d794c54e2167da52" exitCode=0 Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.318074 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerDied","Data":"70b4c191d45098c97322d0ce6969cd1a2a5e887c353bff69d794c54e2167da52"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.319795 4792 generic.go:334] "Generic (PLEG): container finished" podID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerID="0bfc3e0edc8ee5ffd0158c6d1bbfe9471fc06d26b11a134382e90c9fed39c4d9" exitCode=0 Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.319815 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerDied","Data":"0bfc3e0edc8ee5ffd0158c6d1bbfe9471fc06d26b11a134382e90c9fed39c4d9"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.321607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerStarted","Data":"0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8"} Mar 18 15:38:42 
crc kubenswrapper[4792]: I0318 15:38:42.323151 4792 generic.go:334] "Generic (PLEG): container finished" podID="9190b8d3-2c0b-4345-9390-21511bad8701" containerID="2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80" exitCode=0 Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.323220 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerDied","Data":"2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.324755 4792 generic.go:334] "Generic (PLEG): container finished" podID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerID="92a60a9d2487e131e5094e0224eb3149859b2ca06c9322cb55283ebad2af2c86" exitCode=0 Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.324781 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerDied","Data":"92a60a9d2487e131e5094e0224eb3149859b2ca06c9322cb55283ebad2af2c86"} Mar 18 15:38:42 crc kubenswrapper[4792]: I0318 15:38:42.413524 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jqrw8" podStartSLOduration=2.9257550180000003 podStartE2EDuration="58.413505896s" podCreationTimestamp="2026-03-18 15:37:44 +0000 UTC" firstStartedPulling="2026-03-18 15:37:45.585256149 +0000 UTC m=+214.454585086" lastFinishedPulling="2026-03-18 15:38:41.073007027 +0000 UTC m=+269.942335964" observedRunningTime="2026-03-18 15:38:42.410774519 +0000 UTC m=+271.280103476" watchObservedRunningTime="2026-03-18 15:38:42.413505896 +0000 UTC m=+271.282834853" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.344161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" 
event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerStarted","Data":"028a4922a10beb204c15f0348a606d3831a61fd242759de68259eb9a733bb9cf"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.346027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerStarted","Data":"0ef141ef393f3295f1a045ade6b883541629b48a921c8e69f10ca34860c4e412"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.348897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerStarted","Data":"6d3a398b05dbb050f7878eb54da664aba06876caf1436dcd8ade69cfb089bc17"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.350847 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerID="3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63" exitCode=0 Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.350924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerDied","Data":"3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.353129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerStarted","Data":"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.356768 4792 generic.go:334] "Generic (PLEG): container finished" podID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerID="2ae787d66521464ff5636c7c87fccd4928a1a191fceef601a593e45d77bde2b4" exitCode=0 Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 
15:38:44.356824 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerDied","Data":"2ae787d66521464ff5636c7c87fccd4928a1a191fceef601a593e45d77bde2b4"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.361373 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerStarted","Data":"6736d6231e689644fa4fd79eed0e0ab00ad75be0f757dd115c7243a0a1d411e3"} Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.368422 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j66mt" podStartSLOduration=3.506338215 podStartE2EDuration="1m0.368402422s" podCreationTimestamp="2026-03-18 15:37:44 +0000 UTC" firstStartedPulling="2026-03-18 15:37:46.655084193 +0000 UTC m=+215.524413130" lastFinishedPulling="2026-03-18 15:38:43.51714841 +0000 UTC m=+272.386477337" observedRunningTime="2026-03-18 15:38:44.362707286 +0000 UTC m=+273.232036223" watchObservedRunningTime="2026-03-18 15:38:44.368402422 +0000 UTC m=+273.237731359" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.393986 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nztp5" podStartSLOduration=2.759085562 podStartE2EDuration="58.393946836s" podCreationTimestamp="2026-03-18 15:37:46 +0000 UTC" firstStartedPulling="2026-03-18 15:37:47.704781509 +0000 UTC m=+216.574110446" lastFinishedPulling="2026-03-18 15:38:43.339642773 +0000 UTC m=+272.208971720" observedRunningTime="2026-03-18 15:38:44.38874209 +0000 UTC m=+273.258071027" watchObservedRunningTime="2026-03-18 15:38:44.393946836 +0000 UTC m=+273.263275773" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.406122 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-b4ws8" podStartSLOduration=2.5089483230000003 podStartE2EDuration="59.406100675s" podCreationTimestamp="2026-03-18 15:37:45 +0000 UTC" firstStartedPulling="2026-03-18 15:37:46.70159607 +0000 UTC m=+215.570924997" lastFinishedPulling="2026-03-18 15:38:43.598748412 +0000 UTC m=+272.468077349" observedRunningTime="2026-03-18 15:38:44.402823238 +0000 UTC m=+273.272152165" watchObservedRunningTime="2026-03-18 15:38:44.406100675 +0000 UTC m=+273.275429622" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.420269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v4lw5" podStartSLOduration=3.720921489 podStartE2EDuration="58.420249235s" podCreationTimestamp="2026-03-18 15:37:46 +0000 UTC" firstStartedPulling="2026-03-18 15:37:48.76879723 +0000 UTC m=+217.638126167" lastFinishedPulling="2026-03-18 15:38:43.468124976 +0000 UTC m=+272.337453913" observedRunningTime="2026-03-18 15:38:44.418626011 +0000 UTC m=+273.287954958" watchObservedRunningTime="2026-03-18 15:38:44.420249235 +0000 UTC m=+273.289578182" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.439277 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fd6pb" podStartSLOduration=10.581188468 podStartE2EDuration="56.439260695s" podCreationTimestamp="2026-03-18 15:37:48 +0000 UTC" firstStartedPulling="2026-03-18 15:37:57.30998314 +0000 UTC m=+226.179312087" lastFinishedPulling="2026-03-18 15:38:43.168055387 +0000 UTC m=+272.037384314" observedRunningTime="2026-03-18 15:38:44.43857606 +0000 UTC m=+273.307904997" watchObservedRunningTime="2026-03-18 15:38:44.439260695 +0000 UTC m=+273.308589642" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 15:38:44.552707 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:38:44 crc kubenswrapper[4792]: I0318 
15:38:44.552751 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:38:45 crc kubenswrapper[4792]: I0318 15:38:45.262572 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:38:45 crc kubenswrapper[4792]: I0318 15:38:45.262629 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:38:45 crc kubenswrapper[4792]: I0318 15:38:45.379044 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:45 crc kubenswrapper[4792]: I0318 15:38:45.379128 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.012447 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jqrw8" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" probeResult="failure" output=< Mar 18 15:38:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:38:46 crc kubenswrapper[4792]: > Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.311021 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j66mt" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="registry-server" probeResult="failure" output=< Mar 18 15:38:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:38:46 crc kubenswrapper[4792]: > Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.422503 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b4ws8" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="registry-server" 
probeResult="failure" output=< Mar 18 15:38:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:38:46 crc kubenswrapper[4792]: > Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.697837 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svr4m"] Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.756524 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.756572 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:38:46 crc kubenswrapper[4792]: I0318 15:38:46.877057 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:38:47 crc kubenswrapper[4792]: I0318 15:38:47.168310 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:38:47 crc kubenswrapper[4792]: I0318 15:38:47.168385 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:38:47 crc kubenswrapper[4792]: I0318 15:38:47.215506 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:38:48 crc kubenswrapper[4792]: I0318 15:38:48.345893 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:38:48 crc kubenswrapper[4792]: I0318 15:38:48.345992 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:38:49 crc kubenswrapper[4792]: I0318 15:38:49.381358 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-fd6pb" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="registry-server" probeResult="failure" output=< Mar 18 15:38:49 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:38:49 crc kubenswrapper[4792]: > Mar 18 15:38:52 crc kubenswrapper[4792]: I0318 15:38:52.408058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerStarted","Data":"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3"} Mar 18 15:38:52 crc kubenswrapper[4792]: I0318 15:38:52.410037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerStarted","Data":"d2a5963051d4f363a93bda67ae80995f82f4b245b7ab6f9901641ec6b4746663"} Mar 18 15:38:52 crc kubenswrapper[4792]: I0318 15:38:52.432696 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ps7zj" podStartSLOduration=3.957222253 podStartE2EDuration="1m8.432680966s" podCreationTimestamp="2026-03-18 15:37:44 +0000 UTC" firstStartedPulling="2026-03-18 15:37:46.682041943 +0000 UTC m=+215.551370880" lastFinishedPulling="2026-03-18 15:38:51.157500656 +0000 UTC m=+280.026829593" observedRunningTime="2026-03-18 15:38:52.429200344 +0000 UTC m=+281.298529301" watchObservedRunningTime="2026-03-18 15:38:52.432680966 +0000 UTC m=+281.302009903" Mar 18 15:38:52 crc kubenswrapper[4792]: I0318 15:38:52.449084 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c58jv" podStartSLOduration=11.755268995 podStartE2EDuration="1m5.449066351s" podCreationTimestamp="2026-03-18 15:37:47 +0000 UTC" firstStartedPulling="2026-03-18 15:37:57.3099819 +0000 UTC m=+226.179310847" lastFinishedPulling="2026-03-18 
15:38:51.003779266 +0000 UTC m=+279.873108203" observedRunningTime="2026-03-18 15:38:52.448156112 +0000 UTC m=+281.317485059" watchObservedRunningTime="2026-03-18 15:38:52.449066351 +0000 UTC m=+281.318395288" Mar 18 15:38:54 crc kubenswrapper[4792]: I0318 15:38:54.588464 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:38:54 crc kubenswrapper[4792]: I0318 15:38:54.632085 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:38:54 crc kubenswrapper[4792]: I0318 15:38:54.975729 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:54 crc kubenswrapper[4792]: I0318 15:38:54.976453 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:55 crc kubenswrapper[4792]: I0318 15:38:55.021224 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:55 crc kubenswrapper[4792]: I0318 15:38:55.305852 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:38:55 crc kubenswrapper[4792]: I0318 15:38:55.347889 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:38:55 crc kubenswrapper[4792]: I0318 15:38:55.443658 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:55 crc kubenswrapper[4792]: I0318 15:38:55.494649 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:56 crc kubenswrapper[4792]: I0318 15:38:56.497730 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:56 crc kubenswrapper[4792]: I0318 15:38:56.795016 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.205840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.552077 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"] Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.552410 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b4ws8" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="registry-server" containerID="cri-o://6736d6231e689644fa4fd79eed0e0ab00ad75be0f757dd115c7243a0a1d411e3" gracePeriod=2 Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.754376 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.956168 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.956522 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:38:57 crc kubenswrapper[4792]: I0318 15:38:57.996734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.383491 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:38:58 crc 
kubenswrapper[4792]: I0318 15:38:58.438086 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.445693 4792 generic.go:334] "Generic (PLEG): container finished" podID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerID="6736d6231e689644fa4fd79eed0e0ab00ad75be0f757dd115c7243a0a1d411e3" exitCode=0 Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.445799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerDied","Data":"6736d6231e689644fa4fd79eed0e0ab00ad75be0f757dd115c7243a0a1d411e3"} Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.446360 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ps7zj" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="registry-server" containerID="cri-o://a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3" gracePeriod=2 Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.481579 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.594321 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.712500 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities\") pod \"0bd261d3-47ac-4082-bc3b-128b9b72df06\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.712594 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76n4n\" (UniqueName: \"kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n\") pod \"0bd261d3-47ac-4082-bc3b-128b9b72df06\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.712645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content\") pod \"0bd261d3-47ac-4082-bc3b-128b9b72df06\" (UID: \"0bd261d3-47ac-4082-bc3b-128b9b72df06\") " Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.714878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities" (OuterVolumeSpecName: "utilities") pod "0bd261d3-47ac-4082-bc3b-128b9b72df06" (UID: "0bd261d3-47ac-4082-bc3b-128b9b72df06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.724150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n" (OuterVolumeSpecName: "kube-api-access-76n4n") pod "0bd261d3-47ac-4082-bc3b-128b9b72df06" (UID: "0bd261d3-47ac-4082-bc3b-128b9b72df06"). InnerVolumeSpecName "kube-api-access-76n4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.761610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bd261d3-47ac-4082-bc3b-128b9b72df06" (UID: "0bd261d3-47ac-4082-bc3b-128b9b72df06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.823091 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.823195 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76n4n\" (UniqueName: \"kubernetes.io/projected/0bd261d3-47ac-4082-bc3b-128b9b72df06-kube-api-access-76n4n\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.823209 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd261d3-47ac-4082-bc3b-128b9b72df06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:58 crc kubenswrapper[4792]: I0318 15:38:58.936315 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.034257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw\") pod \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.034374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities\") pod \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.034446 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content\") pod \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\" (UID: \"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f\") " Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.035843 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities" (OuterVolumeSpecName: "utilities") pod "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" (UID: "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.036889 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw" (OuterVolumeSpecName: "kube-api-access-9ssjw") pod "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" (UID: "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f"). InnerVolumeSpecName "kube-api-access-9ssjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.086566 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" (UID: "2ab5703e-d7a2-4839-b0d5-2851a53b5f6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.135476 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.135530 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-kube-api-access-9ssjw\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.135546 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.452929 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerID="a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3" exitCode=0 Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.452995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerDied","Data":"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3"} Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.453038 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ps7zj" event={"ID":"2ab5703e-d7a2-4839-b0d5-2851a53b5f6f","Type":"ContainerDied","Data":"12e26abc1610172dbf0c43f6102943563ec97301f573e9f63c467d71050823bb"} Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.453061 4792 scope.go:117] "RemoveContainer" containerID="a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.453061 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps7zj" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.455553 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4ws8" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.455590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4ws8" event={"ID":"0bd261d3-47ac-4082-bc3b-128b9b72df06","Type":"ContainerDied","Data":"98488a92705e850367393dd46943e8b59a23936a5d80a71079a12d93f45b8054"} Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.469646 4792 scope.go:117] "RemoveContainer" containerID="3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.490085 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.493999 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ps7zj"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.497602 4792 scope.go:117] "RemoveContainer" containerID="543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.502792 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"] Mar 18 
15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.509480 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b4ws8"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.524657 4792 scope.go:117] "RemoveContainer" containerID="a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3" Mar 18 15:38:59 crc kubenswrapper[4792]: E0318 15:38:59.525302 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3\": container with ID starting with a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3 not found: ID does not exist" containerID="a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.525350 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3"} err="failed to get container status \"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3\": rpc error: code = NotFound desc = could not find container \"a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3\": container with ID starting with a453abb64c18a02b7e305720b3c34c3f8c98b443126a575347b6bba3f0c228a3 not found: ID does not exist" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.525382 4792 scope.go:117] "RemoveContainer" containerID="3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63" Mar 18 15:38:59 crc kubenswrapper[4792]: E0318 15:38:59.525792 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63\": container with ID starting with 3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63 not found: ID does not exist" 
containerID="3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.525832 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63"} err="failed to get container status \"3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63\": rpc error: code = NotFound desc = could not find container \"3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63\": container with ID starting with 3250fabf2d2fb7ba646876071dd51a5705047779cbb8cbe5f8faa9b63413aa63 not found: ID does not exist" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.525860 4792 scope.go:117] "RemoveContainer" containerID="543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899" Mar 18 15:38:59 crc kubenswrapper[4792]: E0318 15:38:59.526168 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899\": container with ID starting with 543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899 not found: ID does not exist" containerID="543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.526202 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899"} err="failed to get container status \"543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899\": rpc error: code = NotFound desc = could not find container \"543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899\": container with ID starting with 543b1cad2262d65cfbd8ba63d755365dffbbdcd91dc751e09aee1904fce80899 not found: ID does not exist" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.526245 4792 scope.go:117] 
"RemoveContainer" containerID="6736d6231e689644fa4fd79eed0e0ab00ad75be0f757dd115c7243a0a1d411e3" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.557099 4792 scope.go:117] "RemoveContainer" containerID="92a60a9d2487e131e5094e0224eb3149859b2ca06c9322cb55283ebad2af2c86" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.576204 4792 scope.go:117] "RemoveContainer" containerID="c4515a464551556860e70804dbbaf8091ad56d94a555c8bfadaaf4d7dbea10d2" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.840025 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.840225 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" podUID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" containerName="controller-manager" containerID="cri-o://2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60" gracePeriod=30 Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.862456 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" path="/var/lib/kubelet/pods/0bd261d3-47ac-4082-bc3b-128b9b72df06/volumes" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.863095 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" path="/var/lib/kubelet/pods/2ab5703e-d7a2-4839-b0d5-2851a53b5f6f/volumes" Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.937688 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.937949 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" 
podUID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" containerName="route-controller-manager" containerID="cri-o://8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60" gracePeriod=30 Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.958284 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4lw5"] Mar 18 15:38:59 crc kubenswrapper[4792]: I0318 15:38:59.958713 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v4lw5" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="registry-server" containerID="cri-o://3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a" gracePeriod=2 Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.432594 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.443715 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.448192 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.461978 4792 generic.go:334] "Generic (PLEG): container finished" podID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" containerID="8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60" exitCode=0 Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.462075 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.462083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" event={"ID":"f7e0560a-ec5f-4c71-bec8-c69aa8321608","Type":"ContainerDied","Data":"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.462223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr" event={"ID":"f7e0560a-ec5f-4c71-bec8-c69aa8321608","Type":"ContainerDied","Data":"c69548edbbe31d1b4829bc695d1b6cc292c8141b666fc796e80a44428509d221"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.462256 4792 scope.go:117] "RemoveContainer" containerID="8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.465700 4792 generic.go:334] "Generic (PLEG): container finished" podID="9190b8d3-2c0b-4345-9390-21511bad8701" containerID="3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a" exitCode=0 Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.465750 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerDied","Data":"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.465770 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v4lw5" event={"ID":"9190b8d3-2c0b-4345-9390-21511bad8701","Type":"ContainerDied","Data":"395ec634a6b183205d145dbf7ef68691b994f10fd88a9e55ea28d56282ba6ef2"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.465830 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v4lw5" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.467084 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" containerID="2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60" exitCode=0 Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.467126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" event={"ID":"d4b82531-c864-4b5c-9a5f-59170ddb5c7c","Type":"ContainerDied","Data":"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.467143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" event={"ID":"d4b82531-c864-4b5c-9a5f-59170ddb5c7c","Type":"ContainerDied","Data":"4333338f59a440c93b07e54a51c581b2c8810312b0e8486d76498319b62cfcfc"} Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.467179 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bf4c887f-mdlxm" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.484549 4792 scope.go:117] "RemoveContainer" containerID="8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60" Mar 18 15:39:00 crc kubenswrapper[4792]: E0318 15:39:00.485487 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60\": container with ID starting with 8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60 not found: ID does not exist" containerID="8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.485535 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60"} err="failed to get container status \"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60\": rpc error: code = NotFound desc = could not find container \"8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60\": container with ID starting with 8ffee9d959ac704c15468499a273af3f7414b6870f8bce03e50a9739b39aba60 not found: ID does not exist" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.485599 4792 scope.go:117] "RemoveContainer" containerID="3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.503827 4792 scope.go:117] "RemoveContainer" containerID="2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.517333 4792 scope.go:117] "RemoveContainer" containerID="44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.534074 4792 scope.go:117] "RemoveContainer" 
containerID="3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a" Mar 18 15:39:00 crc kubenswrapper[4792]: E0318 15:39:00.534586 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a\": container with ID starting with 3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a not found: ID does not exist" containerID="3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.534626 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a"} err="failed to get container status \"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a\": rpc error: code = NotFound desc = could not find container \"3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a\": container with ID starting with 3f5d6f6b7a53c03a483c29d09486841bbf79a9b41d6b83138547e9dc8831066a not found: ID does not exist" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.534655 4792 scope.go:117] "RemoveContainer" containerID="2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80" Mar 18 15:39:00 crc kubenswrapper[4792]: E0318 15:39:00.534951 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80\": container with ID starting with 2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80 not found: ID does not exist" containerID="2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.534991 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80"} err="failed to get container status \"2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80\": rpc error: code = NotFound desc = could not find container \"2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80\": container with ID starting with 2bdd7e6695902055d90d5b456331ebbb5e31df66cd1a5d41d3ed2aa36597cd80 not found: ID does not exist" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.535018 4792 scope.go:117] "RemoveContainer" containerID="44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7" Mar 18 15:39:00 crc kubenswrapper[4792]: E0318 15:39:00.535314 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7\": container with ID starting with 44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7 not found: ID does not exist" containerID="44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.535357 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7"} err="failed to get container status \"44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7\": rpc error: code = NotFound desc = could not find container \"44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7\": container with ID starting with 44bbb8ae8b43cf544f0643f13d6b451a2d2dfddafd7fb4d4be6723698ed93bf7 not found: ID does not exist" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.535385 4792 scope.go:117] "RemoveContainer" containerID="2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.552157 4792 scope.go:117] "RemoveContainer" 
containerID="2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60" Mar 18 15:39:00 crc kubenswrapper[4792]: E0318 15:39:00.552658 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60\": container with ID starting with 2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60 not found: ID does not exist" containerID="2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.552708 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60"} err="failed to get container status \"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60\": rpc error: code = NotFound desc = could not find container \"2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60\": container with ID starting with 2ea2f34e9ca3b5846bf1c496ac8d354270d79ff49fa19aca0e89c062a1d49f60 not found: ID does not exist" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554433 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles\") pod \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554505 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content\") pod \"9190b8d3-2c0b-4345-9390-21511bad8701\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert\") pod \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw89p\" (UniqueName: \"kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p\") pod \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554803 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config\") pod \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config\") pod \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities\") pod \"9190b8d3-2c0b-4345-9390-21511bad8701\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84rjk\" (UniqueName: \"kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk\") pod \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " Mar 18 15:39:00 crc kubenswrapper[4792]: 
I0318 15:39:00.554886 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert\") pod \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554906 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btc9s\" (UniqueName: \"kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s\") pod \"9190b8d3-2c0b-4345-9390-21511bad8701\" (UID: \"9190b8d3-2c0b-4345-9390-21511bad8701\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554922 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca\") pod \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\" (UID: \"f7e0560a-ec5f-4c71-bec8-c69aa8321608\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.554938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca\") pod \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\" (UID: \"d4b82531-c864-4b5c-9a5f-59170ddb5c7c\") " Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.555507 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d4b82531-c864-4b5c-9a5f-59170ddb5c7c" (UID: "d4b82531-c864-4b5c-9a5f-59170ddb5c7c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.555633 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities" (OuterVolumeSpecName: "utilities") pod "9190b8d3-2c0b-4345-9390-21511bad8701" (UID: "9190b8d3-2c0b-4345-9390-21511bad8701"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.555735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config" (OuterVolumeSpecName: "config") pod "d4b82531-c864-4b5c-9a5f-59170ddb5c7c" (UID: "d4b82531-c864-4b5c-9a5f-59170ddb5c7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.555961 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4b82531-c864-4b5c-9a5f-59170ddb5c7c" (UID: "d4b82531-c864-4b5c-9a5f-59170ddb5c7c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.558606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca" (OuterVolumeSpecName: "client-ca") pod "f7e0560a-ec5f-4c71-bec8-c69aa8321608" (UID: "f7e0560a-ec5f-4c71-bec8-c69aa8321608"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.558701 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config" (OuterVolumeSpecName: "config") pod "f7e0560a-ec5f-4c71-bec8-c69aa8321608" (UID: "f7e0560a-ec5f-4c71-bec8-c69aa8321608"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.561525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk" (OuterVolumeSpecName: "kube-api-access-84rjk") pod "f7e0560a-ec5f-4c71-bec8-c69aa8321608" (UID: "f7e0560a-ec5f-4c71-bec8-c69aa8321608"). InnerVolumeSpecName "kube-api-access-84rjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.561565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p" (OuterVolumeSpecName: "kube-api-access-hw89p") pod "d4b82531-c864-4b5c-9a5f-59170ddb5c7c" (UID: "d4b82531-c864-4b5c-9a5f-59170ddb5c7c"). InnerVolumeSpecName "kube-api-access-hw89p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.561614 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s" (OuterVolumeSpecName: "kube-api-access-btc9s") pod "9190b8d3-2c0b-4345-9390-21511bad8701" (UID: "9190b8d3-2c0b-4345-9390-21511bad8701"). InnerVolumeSpecName "kube-api-access-btc9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.561562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4b82531-c864-4b5c-9a5f-59170ddb5c7c" (UID: "d4b82531-c864-4b5c-9a5f-59170ddb5c7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.561748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f7e0560a-ec5f-4c71-bec8-c69aa8321608" (UID: "f7e0560a-ec5f-4c71-bec8-c69aa8321608"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.587829 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9190b8d3-2c0b-4345-9390-21511bad8701" (UID: "9190b8d3-2c0b-4345-9390-21511bad8701"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.655912 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.655955 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7e0560a-ec5f-4c71-bec8-c69aa8321608-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656026 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw89p\" (UniqueName: \"kubernetes.io/projected/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-kube-api-access-hw89p\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656045 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656058 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656081 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9190b8d3-2c0b-4345-9390-21511bad8701-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656094 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84rjk\" (UniqueName: \"kubernetes.io/projected/f7e0560a-ec5f-4c71-bec8-c69aa8321608-kube-api-access-84rjk\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656107 4792 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btc9s\" (UniqueName: \"kubernetes.io/projected/9190b8d3-2c0b-4345-9390-21511bad8701-kube-api-access-btc9s\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656118 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656129 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656139 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e0560a-ec5f-4c71-bec8-c69aa8321608-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.656152 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b82531-c864-4b5c-9a5f-59170ddb5c7c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.792790 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.795640 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8844c8d8f-hf8mr"] Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.803727 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v4lw5"] Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.807054 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-v4lw5"] Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.815830 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:39:00 crc kubenswrapper[4792]: I0318 15:39:00.824547 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bf4c887f-mdlxm"] Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494423 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494657 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494674 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494687 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494695 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494709 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494718 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494728 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494736 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494747 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494754 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="extract-utilities" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494764 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494771 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494798 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494803 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494811 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" containerName="controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" containerName="controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494825 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494841 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494847 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="extract-content" Mar 18 15:39:01 crc kubenswrapper[4792]: E0318 15:39:01.494856 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" containerName="route-controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494862 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" containerName="route-controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494946 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab5703e-d7a2-4839-b0d5-2851a53b5f6f" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd261d3-47ac-4082-bc3b-128b9b72df06" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494988 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" containerName="registry-server" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.494994 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" containerName="route-controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.495003 4792 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" containerName="controller-manager" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.495343 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.499741 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.500482 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.501741 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.501928 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.502146 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.502155 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.503069 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.504438 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.507586 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.507886 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.507945 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.507886 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.508352 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.508898 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.520757 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.534761 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.537897 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.567995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwqjk\" (UniqueName: \"kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " 
pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568204 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568230 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247ks\" (UniqueName: \"kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568283 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.568326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.669269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.669608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.669750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.669876 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwqjk\" (UniqueName: \"kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc 
kubenswrapper[4792]: I0318 15:39:01.670160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670642 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: 
\"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.670786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.671121 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247ks\" (UniqueName: \"kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.671533 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.671793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.674487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.676597 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.685624 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwqjk\" (UniqueName: \"kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk\") pod \"controller-manager-558645b857-5vrs9\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.686007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247ks\" (UniqueName: \"kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks\") pod \"route-controller-manager-68dcf467c5-7kvkv\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.810169 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.825232 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.862390 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9190b8d3-2c0b-4345-9390-21511bad8701" path="/var/lib/kubelet/pods/9190b8d3-2c0b-4345-9390-21511bad8701/volumes" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.863082 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b82531-c864-4b5c-9a5f-59170ddb5c7c" path="/var/lib/kubelet/pods/d4b82531-c864-4b5c-9a5f-59170ddb5c7c/volumes" Mar 18 15:39:01 crc kubenswrapper[4792]: I0318 15:39:01.863846 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e0560a-ec5f-4c71-bec8-c69aa8321608" path="/var/lib/kubelet/pods/f7e0560a-ec5f-4c71-bec8-c69aa8321608/volumes" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.044511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.323375 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:39:02 crc kubenswrapper[4792]: W0318 15:39:02.330380 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97bf7ace_de36_42fc_aca7_8f9fe2bed94c.slice/crio-22bfa5b3c8db90878eda0e2d5692c83a8ef8b9233d90b373b1f1bf8a94de0a8c WatchSource:0}: Error finding container 22bfa5b3c8db90878eda0e2d5692c83a8ef8b9233d90b373b1f1bf8a94de0a8c: Status 404 returned error can't find the container with id 22bfa5b3c8db90878eda0e2d5692c83a8ef8b9233d90b373b1f1bf8a94de0a8c Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.349205 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 
15:39:02.349505 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fd6pb" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="registry-server" containerID="cri-o://0ef141ef393f3295f1a045ade6b883541629b48a921c8e69f10ca34860c4e412" gracePeriod=2 Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.482763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" event={"ID":"97bf7ace-de36-42fc-aca7-8f9fe2bed94c","Type":"ContainerStarted","Data":"22bfa5b3c8db90878eda0e2d5692c83a8ef8b9233d90b373b1f1bf8a94de0a8c"} Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.485201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" event={"ID":"473ce97d-c7f0-4ec4-8530-c46179d81c30","Type":"ContainerStarted","Data":"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8"} Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.485235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" event={"ID":"473ce97d-c7f0-4ec4-8530-c46179d81c30","Type":"ContainerStarted","Data":"dd1cd1fcb17cf50cbc4cad18488ceade7fccd451eda9541ace6dbb5408b4252a"} Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.485526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.494114 4792 generic.go:334] "Generic (PLEG): container finished" podID="de3a8b86-9557-46d6-b146-bc402161292c" containerID="0ef141ef393f3295f1a045ade6b883541629b48a921c8e69f10ca34860c4e412" exitCode=0 Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.494167 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerDied","Data":"0ef141ef393f3295f1a045ade6b883541629b48a921c8e69f10ca34860c4e412"} Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.714105 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.735083 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" podStartSLOduration=3.7350618190000002 podStartE2EDuration="3.735061819s" podCreationTimestamp="2026-03-18 15:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:02.514169053 +0000 UTC m=+291.383497990" watchObservedRunningTime="2026-03-18 15:39:02.735061819 +0000 UTC m=+291.604390766" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.888254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9ph\" (UniqueName: \"kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph\") pod \"de3a8b86-9557-46d6-b146-bc402161292c\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.888496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content\") pod \"de3a8b86-9557-46d6-b146-bc402161292c\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.888543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities\") pod 
\"de3a8b86-9557-46d6-b146-bc402161292c\" (UID: \"de3a8b86-9557-46d6-b146-bc402161292c\") " Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.889320 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities" (OuterVolumeSpecName: "utilities") pod "de3a8b86-9557-46d6-b146-bc402161292c" (UID: "de3a8b86-9557-46d6-b146-bc402161292c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.896812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph" (OuterVolumeSpecName: "kube-api-access-bx9ph") pod "de3a8b86-9557-46d6-b146-bc402161292c" (UID: "de3a8b86-9557-46d6-b146-bc402161292c"). InnerVolumeSpecName "kube-api-access-bx9ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.989940 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9ph\" (UniqueName: \"kubernetes.io/projected/de3a8b86-9557-46d6-b146-bc402161292c-kube-api-access-bx9ph\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:02 crc kubenswrapper[4792]: I0318 15:39:02.990016 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.026389 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de3a8b86-9557-46d6-b146-bc402161292c" (UID: "de3a8b86-9557-46d6-b146-bc402161292c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.063527 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.090667 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de3a8b86-9557-46d6-b146-bc402161292c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.500845 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" event={"ID":"97bf7ace-de36-42fc-aca7-8f9fe2bed94c","Type":"ContainerStarted","Data":"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d"} Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.500904 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.502722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fd6pb" event={"ID":"de3a8b86-9557-46d6-b146-bc402161292c","Type":"ContainerDied","Data":"81bd1978217df14fda7d9d4d6b03dc067411d72b342d86101e9cee5b73751d52"} Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.502835 4792 scope.go:117] "RemoveContainer" containerID="0ef141ef393f3295f1a045ade6b883541629b48a921c8e69f10ca34860c4e412" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.502762 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fd6pb" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.505990 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.518169 4792 scope.go:117] "RemoveContainer" containerID="70b4c191d45098c97322d0ce6969cd1a2a5e887c353bff69d794c54e2167da52" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.522852 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" podStartSLOduration=4.52283085 podStartE2EDuration="4.52283085s" podCreationTimestamp="2026-03-18 15:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:03.522018744 +0000 UTC m=+292.391347691" watchObservedRunningTime="2026-03-18 15:39:03.52283085 +0000 UTC m=+292.392159807" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.545490 4792 scope.go:117] "RemoveContainer" containerID="a82e6428fb597054018b209982772e46e6797d95ac347940042d3068045e522d" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.560560 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.566156 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fd6pb"] Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.861306 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3a8b86-9557-46d6-b146-bc402161292c" path="/var/lib/kubelet/pods/de3a8b86-9557-46d6-b146-bc402161292c/volumes" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.862549 4792 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.862745 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="extract-content" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.862763 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="extract-content" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.862773 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="extract-utilities" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.862781 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="extract-utilities" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.862803 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="registry-server" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.862811 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="registry-server" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.862933 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3a8b86-9557-46d6-b146-bc402161292c" containerName="registry-server" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863406 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863498 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863635 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863793 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b" gracePeriod=15 Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863851 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac" gracePeriod=15 Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863895 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2" gracePeriod=15 Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863848 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068" gracePeriod=15 Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.863962 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0" gracePeriod=15 Mar 18 15:39:03 crc 
kubenswrapper[4792]: E0318 15:39:03.864236 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864254 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864269 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864279 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864292 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864301 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864311 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864318 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864328 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864335 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864344 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864352 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864367 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864374 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864389 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864396 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864406 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864413 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: E0318 15:39:03.864424 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864431 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864551 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864567 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864578 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864591 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864599 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864608 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864616 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864626 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.864844 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.867012 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.900236 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.900279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.900560 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.900824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 
15:39:03.900902 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.900939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.901015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:03 crc kubenswrapper[4792]: I0318 15:39:03.901425 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003069 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003224 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc 
kubenswrapper[4792]: I0318 15:39:04.003289 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.003683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.004125 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.004234 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.004264 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.004248 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.004287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:04 crc kubenswrapper[4792]: E0318 15:39:04.126572 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-podcb059e12_7b45_4382_a603_a79a9261d608.slice/crio-3be7f9c354fac75524479c7a64522049921269f33c78f80163d66992409e7cf3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-podcb059e12_7b45_4382_a603_a79a9261d608.slice/crio-conmon-3be7f9c354fac75524479c7a64522049921269f33c78f80163d66992409e7cf3.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.509823 4792 generic.go:334] "Generic (PLEG): container finished" podID="cb059e12-7b45-4382-a603-a79a9261d608" containerID="3be7f9c354fac75524479c7a64522049921269f33c78f80163d66992409e7cf3" exitCode=0 Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.509957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb059e12-7b45-4382-a603-a79a9261d608","Type":"ContainerDied","Data":"3be7f9c354fac75524479c7a64522049921269f33c78f80163d66992409e7cf3"} Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.511025 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.513418 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.515100 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.515887 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2" exitCode=0 Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.515913 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068" exitCode=0 Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.515962 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac" exitCode=0 Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.515984 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0" exitCode=2 Mar 18 15:39:04 crc kubenswrapper[4792]: I0318 15:39:04.516022 4792 scope.go:117] "RemoveContainer" containerID="5c15b3cd51d10abfc6542297eb4f4384fc08386b4f76403fea1d7dc7275bab44" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.524082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.802582 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.803598 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.924418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir\") pod \"cb059e12-7b45-4382-a603-a79a9261d608\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.924864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access\") pod \"cb059e12-7b45-4382-a603-a79a9261d608\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.924910 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock\") pod \"cb059e12-7b45-4382-a603-a79a9261d608\" (UID: \"cb059e12-7b45-4382-a603-a79a9261d608\") " Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.925339 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock" (OuterVolumeSpecName: "var-lock") pod "cb059e12-7b45-4382-a603-a79a9261d608" (UID: "cb059e12-7b45-4382-a603-a79a9261d608"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.933895 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cb059e12-7b45-4382-a603-a79a9261d608" (UID: "cb059e12-7b45-4382-a603-a79a9261d608"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:05 crc kubenswrapper[4792]: I0318 15:39:05.945367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cb059e12-7b45-4382-a603-a79a9261d608" (UID: "cb059e12-7b45-4382-a603-a79a9261d608"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.026599 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.026630 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb059e12-7b45-4382-a603-a79a9261d608-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.026641 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb059e12-7b45-4382-a603-a79a9261d608-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.077537 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: 
connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.078178 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.078608 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.078805 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.078951 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.079012 4792 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.079171 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.280583 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.533558 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.535846 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b" exitCode=0 Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.537773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb059e12-7b45-4382-a603-a79a9261d608","Type":"ContainerDied","Data":"5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade"} Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.537813 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b42605c9b996cba7ec5ee5822b5a8577090f1c50518b2e7dd733c56a9cd8ade" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.537822 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.552812 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: E0318 15:39:06.681632 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.851802 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.853255 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.853934 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:06 crc kubenswrapper[4792]: I0318 15:39:06.854722 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042584 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042625 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042707 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042928 4792 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042945 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.042956 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:07 crc kubenswrapper[4792]: E0318 15:39:07.483014 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.546406 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.547382 4792 scope.go:117] "RemoveContainer" containerID="2434efa7e18ee64bc44eb27bf8032bfcb9f576ef15687350cab1ef2107b4e6d2" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.547482 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.560400 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.560851 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.563848 4792 scope.go:117] "RemoveContainer" containerID="e509d503aaa8f10f90f6b8882e5aa195e540b750f86b07a42bffae9ec4f9a068" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.578211 4792 scope.go:117] "RemoveContainer" containerID="168bb0352495e07178dffe2e788888a1f67966fc155f599c1f01637caf473bac" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.590720 4792 scope.go:117] "RemoveContainer" containerID="feb2f2608bf9411707251c82c7050a7f67b310fd390250645d1e4f4c07767bf0" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.606088 4792 scope.go:117] "RemoveContainer" containerID="1abccc2d96e1ca9a055622a2085ac8d2716e4b38b69a1a5af97fbd47c0a2a19b" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.619252 4792 scope.go:117] "RemoveContainer" containerID="7e548a7add90a17e7370f754c2ab1fdc225ab6ff30d5c5d96d8da42f2a8b403e" Mar 18 15:39:07 crc kubenswrapper[4792]: I0318 15:39:07.860481 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 15:39:08 
crc kubenswrapper[4792]: E0318 15:39:08.893215 4792 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:08 crc kubenswrapper[4792]: I0318 15:39:08.893738 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:08 crc kubenswrapper[4792]: E0318 15:39:08.925031 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df9acd30dfe17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:39:08.924444183 +0000 UTC m=+297.793773160,LastTimestamp:2026-03-18 15:39:08.924444183 +0000 UTC m=+297.793773160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:39:09 crc kubenswrapper[4792]: E0318 15:39:09.083923 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 18 15:39:09 crc kubenswrapper[4792]: I0318 15:39:09.558639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0"} Mar 18 15:39:09 crc kubenswrapper[4792]: I0318 15:39:09.558908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f41eb68002d8c26e9b9fa46d854eb78cc68a4553deebe3c082a82a9aa83ce11e"} Mar 18 15:39:09 crc kubenswrapper[4792]: E0318 15:39:09.559612 4792 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:09 crc kubenswrapper[4792]: I0318 15:39:09.559608 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:11 crc kubenswrapper[4792]: I0318 15:39:11.721851 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerName="oauth-openshift" containerID="cri-o://5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719" gracePeriod=15 Mar 18 15:39:11 crc kubenswrapper[4792]: I0318 15:39:11.856874 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.284482 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:39:12 crc kubenswrapper[4792]: E0318 15:39:12.284660 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.285309 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.285604 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.318176 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " 
Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.319406 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419379 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419520 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login\") pod 
\"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419544 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419586 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419616 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88t6\" (UniqueName: \"kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419663 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 
15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419695 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419728 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.419787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca\") pod \"c51bb642-174a-4bb3-8a20-1708d490a17d\" (UID: \"c51bb642-174a-4bb3-8a20-1708d490a17d\") " Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.420136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.420679 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.420787 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421314 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421335 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421349 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421361 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.421373 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c51bb642-174a-4bb3-8a20-1708d490a17d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.426935 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.433338 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.433376 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6" (OuterVolumeSpecName: "kube-api-access-w88t6") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "kube-api-access-w88t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.433793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.433885 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.434104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.434271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.434451 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.435102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c51bb642-174a-4bb3-8a20-1708d490a17d" (UID: "c51bb642-174a-4bb3-8a20-1708d490a17d"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522657 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522836 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522856 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522868 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88t6\" (UniqueName: \"kubernetes.io/projected/c51bb642-174a-4bb3-8a20-1708d490a17d-kube-api-access-w88t6\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522879 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522888 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522900 4792 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522909 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.522920 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c51bb642-174a-4bb3-8a20-1708d490a17d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.577125 4792 generic.go:334] "Generic (PLEG): container finished" podID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerID="5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719" exitCode=0 Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.577199 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" event={"ID":"c51bb642-174a-4bb3-8a20-1708d490a17d","Type":"ContainerDied","Data":"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719"} Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.577260 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.577295 4792 scope.go:117] "RemoveContainer" containerID="5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.577278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" event={"ID":"c51bb642-174a-4bb3-8a20-1708d490a17d","Type":"ContainerDied","Data":"468e84f2411aca2940d0735a19f6d50dad846138b633f4d4efcf82fa8a8725ca"} Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.578068 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.579340 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.595959 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.596345 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" 
pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.599986 4792 scope.go:117] "RemoveContainer" containerID="5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719" Mar 18 15:39:12 crc kubenswrapper[4792]: E0318 15:39:12.600452 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719\": container with ID starting with 5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719 not found: ID does not exist" containerID="5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719" Mar 18 15:39:12 crc kubenswrapper[4792]: I0318 15:39:12.600518 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719"} err="failed to get container status \"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719\": rpc error: code = NotFound desc = could not find container \"5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719\": container with ID starting with 5512123a5d1b6e3510b0059d63d2af5793ec3e98d4ee63092553d22dcb744719 not found: ID does not exist" Mar 18 15:39:14 crc kubenswrapper[4792]: E0318 15:39:14.284060 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df9acd30dfe17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 15:39:08.924444183 +0000 UTC m=+297.793773160,LastTimestamp:2026-03-18 15:39:08.924444183 +0000 UTC m=+297.793773160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.608521 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.609146 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.609198 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4" exitCode=1 Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.609241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4"} Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.609720 4792 scope.go:117] "RemoveContainer" 
containerID="07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.610050 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.610401 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:17 crc kubenswrapper[4792]: I0318 15:39:17.610706 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:17 crc kubenswrapper[4792]: E0318 15:39:17.920818 4792 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" volumeName="registry-storage" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.616571 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.617117 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.617165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8ec9fdc43dd0c98d404cb7c336f638c250927d8cf6308efa9ac38d9f4433d8b7"} Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.618010 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.618434 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.618722 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: 
E0318 15:39:18.685472 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="7s" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.853964 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.855197 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.856165 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.856736 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.868893 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.868938 4792 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:18 crc kubenswrapper[4792]: E0318 15:39:18.869410 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:18 crc kubenswrapper[4792]: I0318 15:39:18.870073 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:18 crc kubenswrapper[4792]: W0318 15:39:18.896820 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a7cd6d63cc206ebbd6ba7ebd09a1290798e6f9511fa4b1958b0dd6e6fcdca2a1 WatchSource:0}: Error finding container a7cd6d63cc206ebbd6ba7ebd09a1290798e6f9511fa4b1958b0dd6e6fcdca2a1: Status 404 returned error can't find the container with id a7cd6d63cc206ebbd6ba7ebd09a1290798e6f9511fa4b1958b0dd6e6fcdca2a1 Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.626357 4792 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8c20754ad2afc1b31eef75e416b204c448ae84e379eaaf3541e48f4f76cef67f" exitCode=0 Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.626484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8c20754ad2afc1b31eef75e416b204c448ae84e379eaaf3541e48f4f76cef67f"} Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.626642 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a7cd6d63cc206ebbd6ba7ebd09a1290798e6f9511fa4b1958b0dd6e6fcdca2a1"} Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.626950 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.626985 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:19 crc kubenswrapper[4792]: E0318 15:39:19.627450 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.627542 4792 status_manager.go:851] "Failed to get status for pod" podUID="cb059e12-7b45-4382-a603-a79a9261d608" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.632218 4792 status_manager.go:851] "Failed to get status for pod" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" pod="openshift-authentication/oauth-openshift-558db77b4-svr4m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-svr4m\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:19 crc kubenswrapper[4792]: I0318 15:39:19.632535 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 18 15:39:20 crc kubenswrapper[4792]: I0318 15:39:20.639697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b8481c029bcdeaf866164847fdf7b0dae4677931da2451df0d367d51bde4ef3d"} Mar 18 15:39:20 crc kubenswrapper[4792]: I0318 15:39:20.639935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6e56181908f0400a556a4aaf66f70834a8b44cb0b2331bb4ae97485727c9e838"} Mar 18 15:39:20 crc kubenswrapper[4792]: I0318 15:39:20.639945 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eb758047e422a7a032a041817f341429a6c388f5079dc9df1a440275502c97e1"} Mar 18 15:39:20 crc kubenswrapper[4792]: I0318 15:39:20.639953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8e4192b2c274235d2847429486d540639a63f3f7d47bd6ebdc070ead7e3d777d"} Mar 18 15:39:21 crc kubenswrapper[4792]: I0318 15:39:21.648953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ff514ebeafdf06f78e3c502cbfef87e4321497a19fa0f6608b32cd9c118b8598"} Mar 18 15:39:21 crc kubenswrapper[4792]: I0318 15:39:21.649237 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:21 crc kubenswrapper[4792]: I0318 
15:39:21.649375 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:21 crc kubenswrapper[4792]: I0318 15:39:21.649403 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:23 crc kubenswrapper[4792]: I0318 15:39:23.510669 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:39:23 crc kubenswrapper[4792]: I0318 15:39:23.870421 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:23 crc kubenswrapper[4792]: I0318 15:39:23.870462 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:23 crc kubenswrapper[4792]: I0318 15:39:23.876924 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:24 crc kubenswrapper[4792]: I0318 15:39:24.386836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:39:24 crc kubenswrapper[4792]: I0318 15:39:24.390954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:39:26 crc kubenswrapper[4792]: I0318 15:39:26.659094 4792 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:26 crc kubenswrapper[4792]: I0318 15:39:26.674821 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:26 crc kubenswrapper[4792]: I0318 15:39:26.674852 4792 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:26 crc kubenswrapper[4792]: I0318 15:39:26.679720 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:26 crc kubenswrapper[4792]: I0318 15:39:26.730323 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ed69519f-3e66-4497-af53-e529afe80bbd" Mar 18 15:39:27 crc kubenswrapper[4792]: I0318 15:39:27.679259 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:27 crc kubenswrapper[4792]: I0318 15:39:27.679287 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e46cf24-bfda-4bce-86e4-42a4b755b41f" Mar 18 15:39:27 crc kubenswrapper[4792]: I0318 15:39:27.682020 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ed69519f-3e66-4497-af53-e529afe80bbd" Mar 18 15:39:33 crc kubenswrapper[4792]: I0318 15:39:33.515227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 15:39:36 crc kubenswrapper[4792]: I0318 15:39:36.062830 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:39:36 crc kubenswrapper[4792]: I0318 15:39:36.249906 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 15:39:36 crc kubenswrapper[4792]: I0318 15:39:36.603198 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 15:39:36 crc kubenswrapper[4792]: I0318 15:39:36.938262 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.173564 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.240174 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.457012 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.652232 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.992132 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 15:39:37 crc kubenswrapper[4792]: I0318 15:39:37.998064 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.001075 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.098720 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.258691 4792 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.266359 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svr4m","openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.266443 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.270615 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.285931 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.285906077 podStartE2EDuration="12.285906077s" podCreationTimestamp="2026-03-18 15:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:39:38.28344931 +0000 UTC m=+327.152778257" watchObservedRunningTime="2026-03-18 15:39:38.285906077 +0000 UTC m=+327.155235014" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.488319 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.675186 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.885885 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.927842 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 
15:39:38.942340 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 15:39:38 crc kubenswrapper[4792]: I0318 15:39:38.998415 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.031598 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.070181 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.080012 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.147486 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.265435 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.292225 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.480819 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.506916 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.525681 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.636784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.832497 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 15:39:39 crc kubenswrapper[4792]: I0318 15:39:39.860932 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" path="/var/lib/kubelet/pods/c51bb642-174a-4bb3-8a20-1708d490a17d/volumes" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.046339 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.107859 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.267475 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.272737 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.303485 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.313455 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.338854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 15:39:40 crc 
kubenswrapper[4792]: I0318 15:39:40.342438 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.385341 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.389549 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.450263 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.522460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.540145 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.622964 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.695460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.748656 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.768067 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.858527 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 15:39:40 
crc kubenswrapper[4792]: I0318 15:39:40.886022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.887639 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.930468 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.948404 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:39:40 crc kubenswrapper[4792]: I0318 15:39:40.978073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.203911 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.238636 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.280290 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.386562 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.415892 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 15:39:41 crc kubenswrapper[4792]: 
I0318 15:39:41.451450 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.499567 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.505708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.527734 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.542834 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.590686 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.649622 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.849460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.854062 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.911200 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 15:39:41 crc kubenswrapper[4792]: I0318 15:39:41.963924 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.047894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.108036 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.149919 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.152129 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.207235 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.209180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.221360 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.255271 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.256116 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.291330 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.327545 4792 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.342730 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.343643 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.417465 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.540465 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.567415 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.599216 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.647556 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.723733 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.760763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.797183 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 15:39:42 crc 
kubenswrapper[4792]: I0318 15:39:42.814474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.827531 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 15:39:42 crc kubenswrapper[4792]: I0318 15:39:42.967745 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.017763 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.029556 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.040940 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.196856 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.223091 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.271984 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.313749 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.325517 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.341189 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.344861 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.380498 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.380731 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.401778 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.477858 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.546911 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.550303 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.559238 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.597018 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.626000 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.644223 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.683567 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.709610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.851401 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.932144 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 15:39:43 crc kubenswrapper[4792]: I0318 15:39:43.946993 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.003100 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.048086 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.075910 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.076073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 15:39:44 crc 
kubenswrapper[4792]: I0318 15:39:44.164206 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.184528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.289338 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.346153 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.355029 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.430049 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.513068 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.530715 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.544506 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.550506 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.584578 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.653665 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.915150 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:39:44 crc kubenswrapper[4792]: I0318 15:39:44.997889 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.186693 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.295288 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.296197 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.382693 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.488320 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.582736 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.605884 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 
15:39:45.610006 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.635591 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.726235 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.818616 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.836872 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:39:45 crc kubenswrapper[4792]: I0318 15:39:45.861676 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.037141 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.133483 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.191206 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.308006 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.396877 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.417387 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.544935 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.585481 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.592805 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.607057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.607800 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.610948 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.625441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.663071 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.672453 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.683231 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.723960 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.821731 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.930617 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 15:39:46 crc kubenswrapper[4792]: I0318 15:39:46.940611 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.007759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.041918 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.125660 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.153924 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.192622 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.242655 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.246622 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.266035 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.425454 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.440935 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.514136 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.545745 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.552307 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.587657 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.684037 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.764093 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.785713 4792 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.844464 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.949198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 15:39:47 crc kubenswrapper[4792]: I0318 15:39:47.954455 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.259070 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.304273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.323743 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.379021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.445948 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.502504 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.513629 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 
15:39:48.581867 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.598176 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.632323 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.725788 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.922412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.923157 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 15:39:48 crc kubenswrapper[4792]: I0318 15:39:48.930335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.156094 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.192274 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.192726 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0" gracePeriod=5 Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.215112 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.255322 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.320199 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.474213 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.611473 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.644136 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.727333 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.739225 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.754896 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.819092 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 
15:39:49.819711 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 15:39:49 crc kubenswrapper[4792]: I0318 15:39:49.959138 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.146591 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.219246 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.230592 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.265030 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.269799 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.479246 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.655099 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 15:39:50 crc kubenswrapper[4792]: I0318 15:39:50.804477 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.029148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 15:39:51 crc 
kubenswrapper[4792]: I0318 15:39:51.281070 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.317484 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.417454 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.767452 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.792273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.913672 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 15:39:51 crc kubenswrapper[4792]: I0318 15:39:51.984578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 15:39:52 crc kubenswrapper[4792]: I0318 15:39:52.712722 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.761672 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.761744 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.804037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.804097 4792 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0" exitCode=137 Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.804146 4792 scope.go:117] "RemoveContainer" containerID="9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.804224 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.821573 4792 scope.go:117] "RemoveContainer" containerID="9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0" Mar 18 15:39:54 crc kubenswrapper[4792]: E0318 15:39:54.822072 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0\": container with ID starting with 9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0 not found: ID does not exist" containerID="9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.822128 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0"} err="failed to get container status \"9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0\": rpc error: code = NotFound desc = could 
not find container \"9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0\": container with ID starting with 9835e2fdded51bb445a9b9a60992fc276604a7d830ea77559c506d5d4511b2b0 not found: ID does not exist" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.954523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.954616 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.954920 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955082 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955529 4792 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955558 4792 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955576 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.955592 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:54 crc kubenswrapper[4792]: I0318 15:39:54.961992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:39:55 crc kubenswrapper[4792]: I0318 15:39:55.056371 4792 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 15:39:55 crc kubenswrapper[4792]: I0318 15:39:55.860694 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 15:39:59 crc kubenswrapper[4792]: I0318 15:39:59.860180 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:39:59 crc kubenswrapper[4792]: I0318 15:39:59.860623 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" podUID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" containerName="controller-manager" containerID="cri-o://39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d" gracePeriod=30 Mar 18 15:39:59 crc kubenswrapper[4792]: I0318 15:39:59.862890 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:39:59 crc kubenswrapper[4792]: I0318 15:39:59.863145 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" podUID="473ce97d-c7f0-4ec4-8530-c46179d81c30" containerName="route-controller-manager" containerID="cri-o://e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8" gracePeriod=30 Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165378 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564140-x84rj"] Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.165643 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerName="oauth-openshift" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165657 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerName="oauth-openshift" Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.165678 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb059e12-7b45-4382-a603-a79a9261d608" containerName="installer" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165685 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb059e12-7b45-4382-a603-a79a9261d608" containerName="installer" Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.165700 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165707 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165818 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb059e12-7b45-4382-a603-a79a9261d608" containerName="installer" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165827 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c51bb642-174a-4bb3-8a20-1708d490a17d" containerName="oauth-openshift" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.165837 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.166272 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.168548 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.168753 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.170460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.171864 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-x84rj"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.276011 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.280765 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.316395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grj88\" (UniqueName: \"kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88\") pod \"auto-csr-approver-29564140-x84rj\" (UID: \"c03d9dab-5256-4887-8fed-296b586402c3\") " pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.416935 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca\") pod \"473ce97d-c7f0-4ec4-8530-c46179d81c30\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417011 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles\") pod \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417052 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert\") pod \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config\") pod \"473ce97d-c7f0-4ec4-8530-c46179d81c30\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417194 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert\") pod \"473ce97d-c7f0-4ec4-8530-c46179d81c30\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwqjk\" (UniqueName: \"kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk\") pod \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca\") pod \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417276 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247ks\" (UniqueName: \"kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks\") pod \"473ce97d-c7f0-4ec4-8530-c46179d81c30\" (UID: \"473ce97d-c7f0-4ec4-8530-c46179d81c30\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config\") pod \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\" (UID: \"97bf7ace-de36-42fc-aca7-8f9fe2bed94c\") " Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.417385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grj88\" (UniqueName: \"kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88\") pod 
\"auto-csr-approver-29564140-x84rj\" (UID: \"c03d9dab-5256-4887-8fed-296b586402c3\") " pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.418544 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca" (OuterVolumeSpecName: "client-ca") pod "473ce97d-c7f0-4ec4-8530-c46179d81c30" (UID: "473ce97d-c7f0-4ec4-8530-c46179d81c30"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.419479 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config" (OuterVolumeSpecName: "config") pod "97bf7ace-de36-42fc-aca7-8f9fe2bed94c" (UID: "97bf7ace-de36-42fc-aca7-8f9fe2bed94c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.420105 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config" (OuterVolumeSpecName: "config") pod "473ce97d-c7f0-4ec4-8530-c46179d81c30" (UID: "473ce97d-c7f0-4ec4-8530-c46179d81c30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.420201 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97bf7ace-de36-42fc-aca7-8f9fe2bed94c" (UID: "97bf7ace-de36-42fc-aca7-8f9fe2bed94c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.420312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca" (OuterVolumeSpecName: "client-ca") pod "97bf7ace-de36-42fc-aca7-8f9fe2bed94c" (UID: "97bf7ace-de36-42fc-aca7-8f9fe2bed94c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.423464 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "473ce97d-c7f0-4ec4-8530-c46179d81c30" (UID: "473ce97d-c7f0-4ec4-8530-c46179d81c30"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.423482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk" (OuterVolumeSpecName: "kube-api-access-gwqjk") pod "97bf7ace-de36-42fc-aca7-8f9fe2bed94c" (UID: "97bf7ace-de36-42fc-aca7-8f9fe2bed94c"). InnerVolumeSpecName "kube-api-access-gwqjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.423647 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97bf7ace-de36-42fc-aca7-8f9fe2bed94c" (UID: "97bf7ace-de36-42fc-aca7-8f9fe2bed94c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.423726 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks" (OuterVolumeSpecName: "kube-api-access-247ks") pod "473ce97d-c7f0-4ec4-8530-c46179d81c30" (UID: "473ce97d-c7f0-4ec4-8530-c46179d81c30"). InnerVolumeSpecName "kube-api-access-247ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.433100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grj88\" (UniqueName: \"kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88\") pod \"auto-csr-approver-29564140-x84rj\" (UID: \"c03d9dab-5256-4887-8fed-296b586402c3\") " pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.488112 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.518679 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/473ce97d-c7f0-4ec4-8530-c46179d81c30-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519024 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwqjk\" (UniqueName: \"kubernetes.io/projected/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-kube-api-access-gwqjk\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519173 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519315 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247ks\" (UniqueName: \"kubernetes.io/projected/473ce97d-c7f0-4ec4-8530-c46179d81c30-kube-api-access-247ks\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519445 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519563 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519675 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 
15:40:00.519799 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bf7ace-de36-42fc-aca7-8f9fe2bed94c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.519911 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/473ce97d-c7f0-4ec4-8530-c46179d81c30-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.533587 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86567d79f8-v9v86"] Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.534095 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ce97d-c7f0-4ec4-8530-c46179d81c30" containerName="route-controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.534198 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ce97d-c7f0-4ec4-8530-c46179d81c30" containerName="route-controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.534292 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" containerName="controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.534361 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" containerName="controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.534540 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" containerName="controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.534626 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ce97d-c7f0-4ec4-8530-c46179d81c30" containerName="route-controller-manager" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.535055 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.542390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.542597 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.542720 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.542944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.543372 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.546607 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.548035 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.548240 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.548571 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.550471 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 15:40:00 crc 
kubenswrapper[4792]: I0318 15:40:00.550663 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.550808 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.551634 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86567d79f8-v9v86"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.561319 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.576907 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.588890 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.723710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-error\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: 
\"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724180 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-login\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-dir\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724288 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nh2g\" (UniqueName: \"kubernetes.io/projected/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-kube-api-access-4nh2g\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724315 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-policies\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-session\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.724563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: 
\"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826217 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826283 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826315 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-error\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-login\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826403 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-dir\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826425 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nh2g\" (UniqueName: \"kubernetes.io/projected/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-kube-api-access-4nh2g\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") 
" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826451 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-policies\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.826538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-session\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.827309 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-dir\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.827668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.828542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-audit-policies\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.828721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.828959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-service-ca\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 
15:40:00.830917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-session\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.831113 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-login\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.831634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-error\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.831686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.831864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.832198 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.832598 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-router-certs\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.832883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.838005 4792 generic.go:334] "Generic (PLEG): container finished" podID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" containerID="39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d" exitCode=0 Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.838241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" event={"ID":"97bf7ace-de36-42fc-aca7-8f9fe2bed94c","Type":"ContainerDied","Data":"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d"} Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.838612 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" event={"ID":"97bf7ace-de36-42fc-aca7-8f9fe2bed94c","Type":"ContainerDied","Data":"22bfa5b3c8db90878eda0e2d5692c83a8ef8b9233d90b373b1f1bf8a94de0a8c"} Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.838852 4792 scope.go:117] "RemoveContainer" containerID="39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.838883 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558645b857-5vrs9" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.842610 4792 generic.go:334] "Generic (PLEG): container finished" podID="473ce97d-c7f0-4ec4-8530-c46179d81c30" containerID="e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8" exitCode=0 Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.842629 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.842658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" event={"ID":"473ce97d-c7f0-4ec4-8530-c46179d81c30","Type":"ContainerDied","Data":"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8"} Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.842684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv" event={"ID":"473ce97d-c7f0-4ec4-8530-c46179d81c30","Type":"ContainerDied","Data":"dd1cd1fcb17cf50cbc4cad18488ceade7fccd451eda9541ace6dbb5408b4252a"} Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.857442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nh2g\" (UniqueName: \"kubernetes.io/projected/f7b32fdd-97ca-4c56-8981-4bdff318a1e1-kube-api-access-4nh2g\") pod \"oauth-openshift-86567d79f8-v9v86\" (UID: \"f7b32fdd-97ca-4c56-8981-4bdff318a1e1\") " pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.859850 4792 scope.go:117] "RemoveContainer" containerID="39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d" Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.860552 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d\": container with ID starting with 39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d not found: ID does not exist" containerID="39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.860587 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d"} err="failed to get container status \"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d\": rpc error: code = NotFound desc = could not find container \"39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d\": container with ID starting with 39b4e619274b8852121609f586b3bade314003bddfc341848553f30b3524233d not found: ID does not exist" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.860613 4792 scope.go:117] "RemoveContainer" containerID="e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.875717 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.879348 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.884750 4792 scope.go:117] "RemoveContainer" containerID="e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8" Mar 18 15:40:00 crc kubenswrapper[4792]: E0318 15:40:00.885263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8\": container with ID starting with e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8 not found: ID does not exist" containerID="e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.885324 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8"} err="failed to get container status 
\"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8\": rpc error: code = NotFound desc = could not find container \"e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8\": container with ID starting with e283ec64179a2ba2f80d7602659e4cd61fa0a4580aaea7808826893495e046d8 not found: ID does not exist" Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.887126 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dcf467c5-7kvkv"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.893479 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-x84rj"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.897146 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:40:00 crc kubenswrapper[4792]: I0318 15:40:00.900735 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-558645b857-5vrs9"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.328210 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86567d79f8-v9v86"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.531478 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.532677 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.536869 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.537656 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.539710 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540086 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540256 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540659 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540663 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540861 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.540908 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.541201 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.541213 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.541354 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.542887 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.543244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.544263 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.545395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.550914 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.634404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2m8\" (UniqueName: \"kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.634475 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.634513 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.634533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.634581 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736265 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-config\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: 
\"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736381 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/539835fd-4134-4cab-8c05-f7df74b38042-serving-cert\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-client-ca\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkj7c\" (UniqueName: \"kubernetes.io/projected/539835fd-4134-4cab-8c05-f7df74b38042-kube-api-access-hkj7c\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.736925 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2m8\" (UniqueName: \"kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.740605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.742789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " 
pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.746318 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.749458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.757823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2m8\" (UniqueName: \"kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8\") pod \"controller-manager-6477764b84-8j4z9\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.838605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkj7c\" (UniqueName: \"kubernetes.io/projected/539835fd-4134-4cab-8c05-f7df74b38042-kube-api-access-hkj7c\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.838698 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-config\") 
pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.838755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/539835fd-4134-4cab-8c05-f7df74b38042-serving-cert\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.838778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-client-ca\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.840056 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-client-ca\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.841826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/539835fd-4134-4cab-8c05-f7df74b38042-config\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.843033 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/539835fd-4134-4cab-8c05-f7df74b38042-serving-cert\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.852882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-x84rj" event={"ID":"c03d9dab-5256-4887-8fed-296b586402c3","Type":"ContainerStarted","Data":"70ad1246d0b5efbae117b457e04e01fbae30a7023c46bbea9800629d410f2b7e"} Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.856835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkj7c\" (UniqueName: \"kubernetes.io/projected/539835fd-4134-4cab-8c05-f7df74b38042-kube-api-access-hkj7c\") pod \"route-controller-manager-5c8d5cd46d-j96gd\" (UID: \"539835fd-4134-4cab-8c05-f7df74b38042\") " pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.860949 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.862207 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473ce97d-c7f0-4ec4-8530-c46179d81c30" path="/var/lib/kubelet/pods/473ce97d-c7f0-4ec4-8530-c46179d81c30/volumes" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.863725 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97bf7ace-de36-42fc-aca7-8f9fe2bed94c" path="/var/lib/kubelet/pods/97bf7ace-de36-42fc-aca7-8f9fe2bed94c/volumes" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.870795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" event={"ID":"f7b32fdd-97ca-4c56-8981-4bdff318a1e1","Type":"ContainerStarted","Data":"10168ff951a284c9895622bfa17204f92cec700200ba956dfbb3754706a57c2a"} Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.870867 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.870890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" event={"ID":"f7b32fdd-97ca-4c56-8981-4bdff318a1e1","Type":"ContainerStarted","Data":"6d92e05902850b229d25203d9c60147f6a1c97a6134aba7ac4adcb7b17c42c9b"} Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.872577 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:01 crc kubenswrapper[4792]: I0318 15:40:01.944052 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podStartSLOduration=75.944031625 podStartE2EDuration="1m15.944031625s" podCreationTimestamp="2026-03-18 15:38:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:01.938232353 +0000 UTC m=+350.807561310" watchObservedRunningTime="2026-03-18 15:40:01.944031625 +0000 UTC m=+350.813360562" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.060121 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.144488 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.289843 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd"] Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.864511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" event={"ID":"539835fd-4134-4cab-8c05-f7df74b38042","Type":"ContainerStarted","Data":"e7b9506860d932602869ba0047423b6c6cdbabd65989660202038098d0dda043"} Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.864821 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" event={"ID":"539835fd-4134-4cab-8c05-f7df74b38042","Type":"ContainerStarted","Data":"ed1e548c8d51a6f99107d0bb82e7025346707a34fbffbf558a28af62faf8aa63"} 
Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.864837 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.866102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" event={"ID":"af9f0abe-28aa-46e5-8582-148d4b71ea5b","Type":"ContainerStarted","Data":"5f1aaccdaf7b0a89a0dacaba00f9dcdc3759c195af8e181d1b51c36c6ca30ca4"} Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.866135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" event={"ID":"af9f0abe-28aa-46e5-8582-148d4b71ea5b","Type":"ContainerStarted","Data":"11caf470b85f46810f557f37924a695dd9483f5eb4f4b812274bf43a0929f5c0"} Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.866494 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.868035 4792 generic.go:334] "Generic (PLEG): container finished" podID="c03d9dab-5256-4887-8fed-296b586402c3" containerID="0a327326d3ab67e4f51b01b4312dd3a047657baf837bfe4096ff5e6f8af40251" exitCode=0 Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.868182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-x84rj" event={"ID":"c03d9dab-5256-4887-8fed-296b586402c3","Type":"ContainerDied","Data":"0a327326d3ab67e4f51b01b4312dd3a047657baf837bfe4096ff5e6f8af40251"} Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.870718 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.889652 4792 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podStartSLOduration=3.889629148 podStartE2EDuration="3.889629148s" podCreationTimestamp="2026-03-18 15:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:02.885076596 +0000 UTC m=+351.754405543" watchObservedRunningTime="2026-03-18 15:40:02.889629148 +0000 UTC m=+351.758958085" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.900999 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" podStartSLOduration=3.900980343 podStartE2EDuration="3.900980343s" podCreationTimestamp="2026-03-18 15:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:02.900299172 +0000 UTC m=+351.769628119" watchObservedRunningTime="2026-03-18 15:40:02.900980343 +0000 UTC m=+351.770309280" Mar 18 15:40:02 crc kubenswrapper[4792]: I0318 15:40:02.978056 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.128408 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.270909 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grj88\" (UniqueName: \"kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88\") pod \"c03d9dab-5256-4887-8fed-296b586402c3\" (UID: \"c03d9dab-5256-4887-8fed-296b586402c3\") " Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.281222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88" (OuterVolumeSpecName: "kube-api-access-grj88") pod "c03d9dab-5256-4887-8fed-296b586402c3" (UID: "c03d9dab-5256-4887-8fed-296b586402c3"). InnerVolumeSpecName "kube-api-access-grj88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.373377 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grj88\" (UniqueName: \"kubernetes.io/projected/c03d9dab-5256-4887-8fed-296b586402c3-kube-api-access-grj88\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.882105 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564140-x84rj" event={"ID":"c03d9dab-5256-4887-8fed-296b586402c3","Type":"ContainerDied","Data":"70ad1246d0b5efbae117b457e04e01fbae30a7023c46bbea9800629d410f2b7e"} Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.882163 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ad1246d0b5efbae117b457e04e01fbae30a7023c46bbea9800629d410f2b7e" Mar 18 15:40:04 crc kubenswrapper[4792]: I0318 15:40:04.882214 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564140-x84rj" Mar 18 15:40:12 crc kubenswrapper[4792]: I0318 15:40:12.386460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 15:40:14 crc kubenswrapper[4792]: I0318 15:40:14.953857 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerID="dd31c588e6e21b3a6e8c6806ef68af5da9a7631eae231960a048f654e3580e1d" exitCode=0 Mar 18 15:40:14 crc kubenswrapper[4792]: I0318 15:40:14.954120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerDied","Data":"dd31c588e6e21b3a6e8c6806ef68af5da9a7631eae231960a048f654e3580e1d"} Mar 18 15:40:14 crc kubenswrapper[4792]: I0318 15:40:14.954766 4792 scope.go:117] "RemoveContainer" containerID="dd31c588e6e21b3a6e8c6806ef68af5da9a7631eae231960a048f654e3580e1d" Mar 18 15:40:15 crc kubenswrapper[4792]: I0318 15:40:15.961200 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerStarted","Data":"d881a5f5e0d26a03d4713efdd4311f25a26313e4fe7fd5490342ff2d315f0a49"} Mar 18 15:40:15 crc kubenswrapper[4792]: I0318 15:40:15.961875 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:40:15 crc kubenswrapper[4792]: I0318 15:40:15.963719 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:40:16 crc kubenswrapper[4792]: I0318 15:40:16.244658 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 15:40:19 crc kubenswrapper[4792]: I0318 
15:40:19.824802 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:19 crc kubenswrapper[4792]: I0318 15:40:19.825271 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" podUID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" containerName="controller-manager" containerID="cri-o://5f1aaccdaf7b0a89a0dacaba00f9dcdc3759c195af8e181d1b51c36c6ca30ca4" gracePeriod=30 Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.165438 4792 generic.go:334] "Generic (PLEG): container finished" podID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" containerID="5f1aaccdaf7b0a89a0dacaba00f9dcdc3759c195af8e181d1b51c36c6ca30ca4" exitCode=0 Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.165489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" event={"ID":"af9f0abe-28aa-46e5-8582-148d4b71ea5b","Type":"ContainerDied","Data":"5f1aaccdaf7b0a89a0dacaba00f9dcdc3759c195af8e181d1b51c36c6ca30ca4"} Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.359073 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.479868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert\") pod \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.479929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt2m8\" (UniqueName: \"kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8\") pod \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.479958 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config\") pod \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.480019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles\") pod \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.480083 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca\") pod \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\" (UID: \"af9f0abe-28aa-46e5-8582-148d4b71ea5b\") " Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.480718 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "af9f0abe-28aa-46e5-8582-148d4b71ea5b" (UID: "af9f0abe-28aa-46e5-8582-148d4b71ea5b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.480808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config" (OuterVolumeSpecName: "config") pod "af9f0abe-28aa-46e5-8582-148d4b71ea5b" (UID: "af9f0abe-28aa-46e5-8582-148d4b71ea5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.480872 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "af9f0abe-28aa-46e5-8582-148d4b71ea5b" (UID: "af9f0abe-28aa-46e5-8582-148d4b71ea5b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.485457 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af9f0abe-28aa-46e5-8582-148d4b71ea5b" (UID: "af9f0abe-28aa-46e5-8582-148d4b71ea5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.485921 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8" (OuterVolumeSpecName: "kube-api-access-zt2m8") pod "af9f0abe-28aa-46e5-8582-148d4b71ea5b" (UID: "af9f0abe-28aa-46e5-8582-148d4b71ea5b"). InnerVolumeSpecName "kube-api-access-zt2m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.581254 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.581302 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.581311 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f0abe-28aa-46e5-8582-148d4b71ea5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.581320 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2m8\" (UniqueName: \"kubernetes.io/projected/af9f0abe-28aa-46e5-8582-148d4b71ea5b-kube-api-access-zt2m8\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:20 crc kubenswrapper[4792]: I0318 15:40:20.581332 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f0abe-28aa-46e5-8582-148d4b71ea5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.172386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" event={"ID":"af9f0abe-28aa-46e5-8582-148d4b71ea5b","Type":"ContainerDied","Data":"11caf470b85f46810f557f37924a695dd9483f5eb4f4b812274bf43a0929f5c0"} Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.172454 4792 scope.go:117] "RemoveContainer" containerID="5f1aaccdaf7b0a89a0dacaba00f9dcdc3759c195af8e181d1b51c36c6ca30ca4" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.172452 4792 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-8j4z9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.204099 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.207778 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-8j4z9"] Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.552420 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:21 crc kubenswrapper[4792]: E0318 15:40:21.552670 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" containerName="controller-manager" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.552684 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" containerName="controller-manager" Mar 18 15:40:21 crc kubenswrapper[4792]: E0318 15:40:21.552707 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03d9dab-5256-4887-8fed-296b586402c3" containerName="oc" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.552714 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03d9dab-5256-4887-8fed-296b586402c3" containerName="oc" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.552820 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" containerName="controller-manager" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.552839 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03d9dab-5256-4887-8fed-296b586402c3" containerName="oc" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.553278 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.556474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.557748 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.558111 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.558755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.559016 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.559073 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.568464 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.626702 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.692860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxm5\" (UniqueName: \"kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " 
pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.692914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.693000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.693045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.693066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.794107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.794429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.794484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxm5\" (UniqueName: \"kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.794511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.794529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.795614 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.795854 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.796113 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.800752 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.819450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxm5\" (UniqueName: \"kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5\") pod \"controller-manager-584b64974b-jjhx9\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 
15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.860196 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9f0abe-28aa-46e5-8582-148d4b71ea5b" path="/var/lib/kubelet/pods/af9f0abe-28aa-46e5-8582-148d4b71ea5b/volumes" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.868479 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:21 crc kubenswrapper[4792]: I0318 15:40:21.933619 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 15:40:22 crc kubenswrapper[4792]: I0318 15:40:22.345268 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:22 crc kubenswrapper[4792]: W0318 15:40:22.350232 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb163941_849c_40aa_8acd_cadfccd76467.slice/crio-d8801266cd98e428e2596aec213206482daa704c1e539d60c6ef448880effc86 WatchSource:0}: Error finding container d8801266cd98e428e2596aec213206482daa704c1e539d60c6ef448880effc86: Status 404 returned error can't find the container with id d8801266cd98e428e2596aec213206482daa704c1e539d60c6ef448880effc86 Mar 18 15:40:23 crc kubenswrapper[4792]: I0318 15:40:23.183804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" event={"ID":"eb163941-849c-40aa-8acd-cadfccd76467","Type":"ContainerStarted","Data":"0a638aa88ccd2eb712ced108b48617c5a67d3bc3e4aeadba1f6a3ffbd2f1d830"} Mar 18 15:40:23 crc kubenswrapper[4792]: I0318 15:40:23.184205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" 
event={"ID":"eb163941-849c-40aa-8acd-cadfccd76467","Type":"ContainerStarted","Data":"d8801266cd98e428e2596aec213206482daa704c1e539d60c6ef448880effc86"} Mar 18 15:40:23 crc kubenswrapper[4792]: I0318 15:40:23.184228 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:23 crc kubenswrapper[4792]: I0318 15:40:23.189345 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:23 crc kubenswrapper[4792]: I0318 15:40:23.209094 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" podStartSLOduration=4.209071374 podStartE2EDuration="4.209071374s" podCreationTimestamp="2026-03-18 15:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:23.206546695 +0000 UTC m=+372.075875642" watchObservedRunningTime="2026-03-18 15:40:23.209071374 +0000 UTC m=+372.078400311" Mar 18 15:40:28 crc kubenswrapper[4792]: I0318 15:40:28.364561 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 15:40:30 crc kubenswrapper[4792]: I0318 15:40:30.321548 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:40:30 crc kubenswrapper[4792]: I0318 15:40:30.321599 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:40:39 crc kubenswrapper[4792]: I0318 15:40:39.831591 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:39 crc kubenswrapper[4792]: I0318 15:40:39.832270 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" podUID="eb163941-849c-40aa-8acd-cadfccd76467" containerName="controller-manager" containerID="cri-o://0a638aa88ccd2eb712ced108b48617c5a67d3bc3e4aeadba1f6a3ffbd2f1d830" gracePeriod=30 Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.272139 4792 generic.go:334] "Generic (PLEG): container finished" podID="eb163941-849c-40aa-8acd-cadfccd76467" containerID="0a638aa88ccd2eb712ced108b48617c5a67d3bc3e4aeadba1f6a3ffbd2f1d830" exitCode=0 Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.272223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" event={"ID":"eb163941-849c-40aa-8acd-cadfccd76467","Type":"ContainerDied","Data":"0a638aa88ccd2eb712ced108b48617c5a67d3bc3e4aeadba1f6a3ffbd2f1d830"} Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.391256 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.413722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config\") pod \"eb163941-849c-40aa-8acd-cadfccd76467\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.413805 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca\") pod \"eb163941-849c-40aa-8acd-cadfccd76467\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.413880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxm5\" (UniqueName: \"kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5\") pod \"eb163941-849c-40aa-8acd-cadfccd76467\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.413903 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert\") pod \"eb163941-849c-40aa-8acd-cadfccd76467\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.413953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles\") pod \"eb163941-849c-40aa-8acd-cadfccd76467\" (UID: \"eb163941-849c-40aa-8acd-cadfccd76467\") " Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.415132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb163941-849c-40aa-8acd-cadfccd76467" (UID: "eb163941-849c-40aa-8acd-cadfccd76467"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.415245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb163941-849c-40aa-8acd-cadfccd76467" (UID: "eb163941-849c-40aa-8acd-cadfccd76467"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.415492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config" (OuterVolumeSpecName: "config") pod "eb163941-849c-40aa-8acd-cadfccd76467" (UID: "eb163941-849c-40aa-8acd-cadfccd76467"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.420366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb163941-849c-40aa-8acd-cadfccd76467" (UID: "eb163941-849c-40aa-8acd-cadfccd76467"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.420845 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5" (OuterVolumeSpecName: "kube-api-access-xwxm5") pod "eb163941-849c-40aa-8acd-cadfccd76467" (UID: "eb163941-849c-40aa-8acd-cadfccd76467"). InnerVolumeSpecName "kube-api-access-xwxm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.514937 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.514964 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwxm5\" (UniqueName: \"kubernetes.io/projected/eb163941-849c-40aa-8acd-cadfccd76467-kube-api-access-xwxm5\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.514993 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb163941-849c-40aa-8acd-cadfccd76467-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.515001 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:40 crc kubenswrapper[4792]: I0318 15:40:40.515008 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb163941-849c-40aa-8acd-cadfccd76467-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.279694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" event={"ID":"eb163941-849c-40aa-8acd-cadfccd76467","Type":"ContainerDied","Data":"d8801266cd98e428e2596aec213206482daa704c1e539d60c6ef448880effc86"} Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.279808 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-jjhx9" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.280407 4792 scope.go:117] "RemoveContainer" containerID="0a638aa88ccd2eb712ced108b48617c5a67d3bc3e4aeadba1f6a3ffbd2f1d830" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.328003 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.331294 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-jjhx9"] Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.561671 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-dhhrv"] Mar 18 15:40:41 crc kubenswrapper[4792]: E0318 15:40:41.561893 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb163941-849c-40aa-8acd-cadfccd76467" containerName="controller-manager" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.561906 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb163941-849c-40aa-8acd-cadfccd76467" containerName="controller-manager" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.562051 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb163941-849c-40aa-8acd-cadfccd76467" containerName="controller-manager" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.562425 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.576568 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.576685 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.576731 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.577184 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.577338 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.577358 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.583229 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.589726 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-dhhrv"] Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.627227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-client-ca\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " 
pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.627369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-proxy-ca-bundles\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.627412 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-config\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.627470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18572155-5ab2-4ee2-bda9-3bd91f07b526-serving-cert\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.627526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdh69\" (UniqueName: \"kubernetes.io/projected/18572155-5ab2-4ee2-bda9-3bd91f07b526-kube-api-access-jdh69\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.728783 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-config\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.728867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18572155-5ab2-4ee2-bda9-3bd91f07b526-serving-cert\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.728898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdh69\" (UniqueName: \"kubernetes.io/projected/18572155-5ab2-4ee2-bda9-3bd91f07b526-kube-api-access-jdh69\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.728949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-client-ca\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.729027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-proxy-ca-bundles\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.730368 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-config\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.730551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-client-ca\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.730889 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18572155-5ab2-4ee2-bda9-3bd91f07b526-proxy-ca-bundles\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.735635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18572155-5ab2-4ee2-bda9-3bd91f07b526-serving-cert\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.745989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdh69\" (UniqueName: \"kubernetes.io/projected/18572155-5ab2-4ee2-bda9-3bd91f07b526-kube-api-access-jdh69\") pod \"controller-manager-6477764b84-dhhrv\" (UID: \"18572155-5ab2-4ee2-bda9-3bd91f07b526\") " pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 
15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.861214 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb163941-849c-40aa-8acd-cadfccd76467" path="/var/lib/kubelet/pods/eb163941-849c-40aa-8acd-cadfccd76467/volumes" Mar 18 15:40:41 crc kubenswrapper[4792]: I0318 15:40:41.899516 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:42 crc kubenswrapper[4792]: I0318 15:40:42.288303 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6477764b84-dhhrv"] Mar 18 15:40:42 crc kubenswrapper[4792]: W0318 15:40:42.294221 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18572155_5ab2_4ee2_bda9_3bd91f07b526.slice/crio-4b16eea2a047a7e91bbe936effec09222728f7d8960e019c74c03525f13b78c4 WatchSource:0}: Error finding container 4b16eea2a047a7e91bbe936effec09222728f7d8960e019c74c03525f13b78c4: Status 404 returned error can't find the container with id 4b16eea2a047a7e91bbe936effec09222728f7d8960e019c74c03525f13b78c4 Mar 18 15:40:43 crc kubenswrapper[4792]: I0318 15:40:43.295031 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" event={"ID":"18572155-5ab2-4ee2-bda9-3bd91f07b526","Type":"ContainerStarted","Data":"7e87a275953afd7ac06d802f094c79c4894a77a0b2695c2b6a202ba59e9db1d8"} Mar 18 15:40:43 crc kubenswrapper[4792]: I0318 15:40:43.295094 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" event={"ID":"18572155-5ab2-4ee2-bda9-3bd91f07b526","Type":"ContainerStarted","Data":"4b16eea2a047a7e91bbe936effec09222728f7d8960e019c74c03525f13b78c4"} Mar 18 15:40:43 crc kubenswrapper[4792]: I0318 15:40:43.296030 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:43 crc kubenswrapper[4792]: I0318 15:40:43.299491 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 15:40:43 crc kubenswrapper[4792]: I0318 15:40:43.313518 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podStartSLOduration=4.313500334 podStartE2EDuration="4.313500334s" podCreationTimestamp="2026-03-18 15:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:40:43.312087336 +0000 UTC m=+392.181416263" watchObservedRunningTime="2026-03-18 15:40:43.313500334 +0000 UTC m=+392.182829271" Mar 18 15:41:00 crc kubenswrapper[4792]: I0318 15:41:00.321932 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:41:00 crc kubenswrapper[4792]: I0318 15:41:00.323149 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.503235 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67xp4"] Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.504287 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.538718 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67xp4"] Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.693874 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694180 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-tls\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-trusted-ca\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7387cb32-a912-4f5d-8ec2-7856bec087ac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7387cb32-a912-4f5d-8ec2-7856bec087ac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-certificates\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694288 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmq2v\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-kube-api-access-xmq2v\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.694350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-bound-sa-token\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.712105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7387cb32-a912-4f5d-8ec2-7856bec087ac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7387cb32-a912-4f5d-8ec2-7856bec087ac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-certificates\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmq2v\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-kube-api-access-xmq2v\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-bound-sa-token\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-tls\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.795875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-trusted-ca\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.796227 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7387cb32-a912-4f5d-8ec2-7856bec087ac-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.797834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-trusted-ca\") pod \"image-registry-66df7c8f76-67xp4\" (UID: 
\"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.798092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-certificates\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.809087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-registry-tls\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.809342 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7387cb32-a912-4f5d-8ec2-7856bec087ac-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.814852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-bound-sa-token\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.816921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmq2v\" (UniqueName: 
\"kubernetes.io/projected/7387cb32-a912-4f5d-8ec2-7856bec087ac-kube-api-access-xmq2v\") pod \"image-registry-66df7c8f76-67xp4\" (UID: \"7387cb32-a912-4f5d-8ec2-7856bec087ac\") " pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:04 crc kubenswrapper[4792]: I0318 15:41:04.827995 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:05 crc kubenswrapper[4792]: I0318 15:41:05.244881 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67xp4"] Mar 18 15:41:05 crc kubenswrapper[4792]: I0318 15:41:05.425169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" event={"ID":"7387cb32-a912-4f5d-8ec2-7856bec087ac","Type":"ContainerStarted","Data":"0a33123875981c9e7084482feb443c03d1c05a561d187a16044f59ea6329bf41"} Mar 18 15:41:05 crc kubenswrapper[4792]: I0318 15:41:05.425251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" event={"ID":"7387cb32-a912-4f5d-8ec2-7856bec087ac","Type":"ContainerStarted","Data":"c77b3d09b7097aae070128e1c576d35bf63d5a6eefa4e1a1edd1cab5e31db9bb"} Mar 18 15:41:05 crc kubenswrapper[4792]: I0318 15:41:05.425506 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:05 crc kubenswrapper[4792]: I0318 15:41:05.445957 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" podStartSLOduration=1.4459359649999999 podStartE2EDuration="1.445935965s" podCreationTimestamp="2026-03-18 15:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:05.444062964 +0000 UTC m=+414.313391911" 
watchObservedRunningTime="2026-03-18 15:41:05.445935965 +0000 UTC m=+414.315264902" Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.909155 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j66mt"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.910063 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j66mt" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="registry-server" containerID="cri-o://028a4922a10beb204c15f0348a606d3831a61fd242759de68259eb9a733bb9cf" gracePeriod=30 Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.924135 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.924425 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jqrw8" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" containerID="cri-o://0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" gracePeriod=30 Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.926923 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.927176 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" containerID="cri-o://d881a5f5e0d26a03d4713efdd4311f25a26313e4fe7fd5490342ff2d315f0a49" gracePeriod=30 Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.931169 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.931394 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nztp5" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="registry-server" containerID="cri-o://6d3a398b05dbb050f7878eb54da664aba06876caf1436dcd8ade69cfb089bc17" gracePeriod=30 Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.951494 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqkxp"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.952214 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.962067 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.962495 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c58jv" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="registry-server" containerID="cri-o://d2a5963051d4f363a93bda67ae80995f82f4b245b7ab6f9901641ec6b4746663" gracePeriod=30 Mar 18 15:41:23 crc kubenswrapper[4792]: I0318 15:41:23.967990 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqkxp"] Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.041780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.041866 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8q6w\" (UniqueName: \"kubernetes.io/projected/71255010-a6ae-4abf-88f1-f6c61c416ca1-kube-api-access-v8q6w\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.041930 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.144269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.144324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.144374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8q6w\" (UniqueName: \"kubernetes.io/projected/71255010-a6ae-4abf-88f1-f6c61c416ca1-kube-api-access-v8q6w\") pod 
\"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.146788 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.166699 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/71255010-a6ae-4abf-88f1-f6c61c416ca1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.174786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8q6w\" (UniqueName: \"kubernetes.io/projected/71255010-a6ae-4abf-88f1-f6c61c416ca1-kube-api-access-v8q6w\") pod \"marketplace-operator-79b997595-sqkxp\" (UID: \"71255010-a6ae-4abf-88f1-f6c61c416ca1\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.276924 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.540551 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerID="d881a5f5e0d26a03d4713efdd4311f25a26313e4fe7fd5490342ff2d315f0a49" exitCode=0 Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.540596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerDied","Data":"d881a5f5e0d26a03d4713efdd4311f25a26313e4fe7fd5490342ff2d315f0a49"} Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.540909 4792 scope.go:117] "RemoveContainer" containerID="dd31c588e6e21b3a6e8c6806ef68af5da9a7631eae231960a048f654e3580e1d" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.543593 4792 generic.go:334] "Generic (PLEG): container finished" podID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerID="028a4922a10beb204c15f0348a606d3831a61fd242759de68259eb9a733bb9cf" exitCode=0 Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.543640 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerDied","Data":"028a4922a10beb204c15f0348a606d3831a61fd242759de68259eb9a733bb9cf"} Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.545372 4792 generic.go:334] "Generic (PLEG): container finished" podID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerID="6d3a398b05dbb050f7878eb54da664aba06876caf1436dcd8ade69cfb089bc17" exitCode=0 Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.545422 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" 
event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerDied","Data":"6d3a398b05dbb050f7878eb54da664aba06876caf1436dcd8ade69cfb089bc17"} Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.547044 4792 generic.go:334] "Generic (PLEG): container finished" podID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerID="0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" exitCode=0 Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.547098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerDied","Data":"0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8"} Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.549101 4792 generic.go:334] "Generic (PLEG): container finished" podID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerID="d2a5963051d4f363a93bda67ae80995f82f4b245b7ab6f9901641ec6b4746663" exitCode=0 Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.549128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerDied","Data":"d2a5963051d4f363a93bda67ae80995f82f4b245b7ab6f9901641ec6b4746663"} Mar 18 15:41:24 crc kubenswrapper[4792]: E0318 15:41:24.552785 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8 is running failed: container process not found" containerID="0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:41:24 crc kubenswrapper[4792]: E0318 15:41:24.553312 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8 is running failed: container process not found" containerID="0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:41:24 crc kubenswrapper[4792]: E0318 15:41:24.554302 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8 is running failed: container process not found" containerID="0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 15:41:24 crc kubenswrapper[4792]: E0318 15:41:24.554337 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-jqrw8" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.694203 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqkxp"] Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.834077 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-67xp4" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.875322 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.890479 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.957087 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content\") pod \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.957312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities\") pod \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.957488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpmfz\" (UniqueName: \"kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz\") pod \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\" (UID: \"8ba56c2f-0c1b-4201-9960-590b2cb73fb6\") " Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.974541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities" (OuterVolumeSpecName: "utilities") pod "8ba56c2f-0c1b-4201-9960-590b2cb73fb6" (UID: "8ba56c2f-0c1b-4201-9960-590b2cb73fb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:24 crc kubenswrapper[4792]: I0318 15:41:24.974674 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz" (OuterVolumeSpecName: "kube-api-access-cpmfz") pod "8ba56c2f-0c1b-4201-9960-590b2cb73fb6" (UID: "8ba56c2f-0c1b-4201-9960-590b2cb73fb6"). InnerVolumeSpecName "kube-api-access-cpmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.054243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba56c2f-0c1b-4201-9960-590b2cb73fb6" (UID: "8ba56c2f-0c1b-4201-9960-590b2cb73fb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.059105 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.059128 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpmfz\" (UniqueName: \"kubernetes.io/projected/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-kube-api-access-cpmfz\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.059137 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba56c2f-0c1b-4201-9960-590b2cb73fb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.122848 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.128093 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.134871 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.141324 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263521 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vck\" (UniqueName: \"kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck\") pod \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics\") pod \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities\") pod \"b891667e-d9ed-4602-8ef2-b0461e32a955\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content\") pod \"b891667e-d9ed-4602-8ef2-b0461e32a955\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca\") pod \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content\") pod \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content\") pod \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263821 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities\") pod \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\" (UID: \"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.263856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7c8z\" (UniqueName: \"kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z\") pod \"b891667e-d9ed-4602-8ef2-b0461e32a955\" (UID: \"b891667e-d9ed-4602-8ef2-b0461e32a955\") " Mar 18 15:41:25 crc 
kubenswrapper[4792]: I0318 15:41:25.264725 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities" (OuterVolumeSpecName: "utilities") pod "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" (UID: "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.266204 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsq9x\" (UniqueName: \"kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x\") pod \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\" (UID: \"4a397c1a-6373-41ad-b12c-c56ff3afbff0\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.266286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities\") pod \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.266348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4r5l\" (UniqueName: \"kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l\") pod \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\" (UID: \"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2\") " Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.267103 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.267243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities" (OuterVolumeSpecName: "utilities") pod 
"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" (UID: "6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.268372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck" (OuterVolumeSpecName: "kube-api-access-j9vck") pod "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" (UID: "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6"). InnerVolumeSpecName "kube-api-access-j9vck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.268419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4a397c1a-6373-41ad-b12c-c56ff3afbff0" (UID: "4a397c1a-6373-41ad-b12c-c56ff3afbff0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.268854 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4a397c1a-6373-41ad-b12c-c56ff3afbff0" (UID: "4a397c1a-6373-41ad-b12c-c56ff3afbff0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.269864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z" (OuterVolumeSpecName: "kube-api-access-z7c8z") pod "b891667e-d9ed-4602-8ef2-b0461e32a955" (UID: "b891667e-d9ed-4602-8ef2-b0461e32a955"). InnerVolumeSpecName "kube-api-access-z7c8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.270236 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x" (OuterVolumeSpecName: "kube-api-access-dsq9x") pod "4a397c1a-6373-41ad-b12c-c56ff3afbff0" (UID: "4a397c1a-6373-41ad-b12c-c56ff3afbff0"). InnerVolumeSpecName "kube-api-access-dsq9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.270670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l" (OuterVolumeSpecName: "kube-api-access-v4r5l") pod "6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" (UID: "6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2"). InnerVolumeSpecName "kube-api-access-v4r5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.274773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities" (OuterVolumeSpecName: "utilities") pod "b891667e-d9ed-4602-8ef2-b0461e32a955" (UID: "b891667e-d9ed-4602-8ef2-b0461e32a955"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.292663 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" (UID: "50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.319256 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b891667e-d9ed-4602-8ef2-b0461e32a955" (UID: "b891667e-d9ed-4602-8ef2-b0461e32a955"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368180 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7c8z\" (UniqueName: \"kubernetes.io/projected/b891667e-d9ed-4602-8ef2-b0461e32a955-kube-api-access-z7c8z\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368222 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsq9x\" (UniqueName: \"kubernetes.io/projected/4a397c1a-6373-41ad-b12c-c56ff3afbff0-kube-api-access-dsq9x\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368234 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368247 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4r5l\" (UniqueName: \"kubernetes.io/projected/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-kube-api-access-v4r5l\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368259 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vck\" (UniqueName: \"kubernetes.io/projected/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-kube-api-access-j9vck\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368270 4792 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368281 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368292 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b891667e-d9ed-4602-8ef2-b0461e32a955-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368303 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a397c1a-6373-41ad-b12c-c56ff3afbff0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.368314 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.416404 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" (UID: "6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.470193 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.557319 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nztp5" event={"ID":"50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6","Type":"ContainerDied","Data":"f87bdaee6cba6ed543cc3a8e9899d321b86161eca0e573e2a33082a4ea32fa2f"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.557350 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nztp5" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.557418 4792 scope.go:117] "RemoveContainer" containerID="6d3a398b05dbb050f7878eb54da664aba06876caf1436dcd8ade69cfb089bc17" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.560163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqrw8" event={"ID":"8ba56c2f-0c1b-4201-9960-590b2cb73fb6","Type":"ContainerDied","Data":"13fe3eb7d5ad330da2daafe7dd79bdd05f78c0287e42f3ac9315d0ea0430b265"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.560198 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqrw8" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.562357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c58jv" event={"ID":"6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2","Type":"ContainerDied","Data":"b329a552ef2528a52184cce25fa713b520bc6cf09fdf1e30e2d988cf39536ad8"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.562449 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c58jv" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.563887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" event={"ID":"71255010-a6ae-4abf-88f1-f6c61c416ca1","Type":"ContainerStarted","Data":"3ced28e36d7e166f3e741ea296dc8ef5a890dd73282558d7006e199be57dfe12"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.563911 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" event={"ID":"71255010-a6ae-4abf-88f1-f6c61c416ca1","Type":"ContainerStarted","Data":"ba408ec1204c14c9ef3377bfba5397050e5dba89d19064d5e0008093a9310e36"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.565055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" event={"ID":"4a397c1a-6373-41ad-b12c-c56ff3afbff0","Type":"ContainerDied","Data":"d805775f41c0fe6b44d1ee7cc24419f74d141de75cfc91b6e8c13d5ed7791393"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.565104 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l8284" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.573960 4792 scope.go:117] "RemoveContainer" containerID="0bfc3e0edc8ee5ffd0158c6d1bbfe9471fc06d26b11a134382e90c9fed39c4d9" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.578156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j66mt" event={"ID":"b891667e-d9ed-4602-8ef2-b0461e32a955","Type":"ContainerDied","Data":"866e30f0caeabd6de1216e508da8edb23711cc06e1d2d0d13429bf59a37ddd14"} Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.578223 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j66mt" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.604727 4792 scope.go:117] "RemoveContainer" containerID="8564824b6f02c689be7f97b26755e6d3a066da0d45527d997b0477c11d5459cf" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.630655 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" podStartSLOduration=2.630605748 podStartE2EDuration="2.630605748s" podCreationTimestamp="2026-03-18 15:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:41:25.614590338 +0000 UTC m=+434.483919285" watchObservedRunningTime="2026-03-18 15:41:25.630605748 +0000 UTC m=+434.499934695" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.633711 4792 scope.go:117] "RemoveContainer" containerID="0c5592d2b4e8ae21bb7763a8f5ab2169b0b40f9b14b2f7aa9cbede06ca98fba8" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.638721 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j66mt"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.648275 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j66mt"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.658791 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.664899 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l8284"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.673463 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.674593 4792 scope.go:117] 
"RemoveContainer" containerID="ac70cc95e56f07cc16beadff86603fdcfa363a0ac63315a4468c3dbdafad2941" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.677530 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c58jv"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.680530 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.687745 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nztp5"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.690824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.691117 4792 scope.go:117] "RemoveContainer" containerID="e7b351bf7f016c6783724fa4a7474b7c93820944ed6968a08feff2c498415b17" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.693805 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jqrw8"] Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.704680 4792 scope.go:117] "RemoveContainer" containerID="d2a5963051d4f363a93bda67ae80995f82f4b245b7ab6f9901641ec6b4746663" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.725609 4792 scope.go:117] "RemoveContainer" containerID="2ae787d66521464ff5636c7c87fccd4928a1a191fceef601a593e45d77bde2b4" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.770987 4792 scope.go:117] "RemoveContainer" containerID="e4a14016c61c520d1ec65b83d319856831e055f3fcae0fd8c808a59198ee0ebf" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.788446 4792 scope.go:117] "RemoveContainer" containerID="d881a5f5e0d26a03d4713efdd4311f25a26313e4fe7fd5490342ff2d315f0a49" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.804100 4792 scope.go:117] "RemoveContainer" 
containerID="028a4922a10beb204c15f0348a606d3831a61fd242759de68259eb9a733bb9cf" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.818280 4792 scope.go:117] "RemoveContainer" containerID="54bd7a9fc5f42cc9ed4bf62a38db1724e9e652f776fac91e8691d17021b70917" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.832693 4792 scope.go:117] "RemoveContainer" containerID="bdd8003c10399363c6bb26405d3eb81cb892bfedffe91f49486fc11f36500bf6" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.861147 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" path="/var/lib/kubelet/pods/4a397c1a-6373-41ad-b12c-c56ff3afbff0/volumes" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.862044 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" path="/var/lib/kubelet/pods/50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6/volumes" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.862922 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" path="/var/lib/kubelet/pods/6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2/volumes" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.863917 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" path="/var/lib/kubelet/pods/8ba56c2f-0c1b-4201-9960-590b2cb73fb6/volumes" Mar 18 15:41:25 crc kubenswrapper[4792]: I0318 15:41:25.864490 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" path="/var/lib/kubelet/pods/b891667e-d9ed-4602-8ef2-b0461e32a955/volumes" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.125367 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6lntk"] Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.126422 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.126525 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.126614 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.126693 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.126777 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.126862 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.126930 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127021 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127107 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127177 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127260 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127347 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127431 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127508 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127580 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127656 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127727 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127789 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127846 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.127901 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="extract-utilities" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.127987 4792 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.128079 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.128183 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.129439 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.129539 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.129620 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="extract-content" Mar 18 15:41:26 crc kubenswrapper[4792]: E0318 15:41:26.129705 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.129783 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.130043 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d7e9f8-1f62-49fb-a8fc-cc6cd8be7fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.130150 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be7d1e5-fbbe-495a-a2e1-0beb64dacbe2" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.131942 4792 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.132052 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b891667e-d9ed-4602-8ef2-b0461e32a955" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.132166 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba56c2f-0c1b-4201-9960-590b2cb73fb6" containerName="registry-server" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.132244 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a397c1a-6373-41ad-b12c-c56ff3afbff0" containerName="marketplace-operator" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.133749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.138179 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.138239 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lntk"] Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.279273 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-utilities\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.279337 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5jts\" (UniqueName: \"kubernetes.io/projected/1f303ba2-d191-4ad6-a474-de409ea5475b-kube-api-access-d5jts\") pod \"redhat-marketplace-6lntk\" (UID: 
\"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.279366 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-catalog-content\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.324662 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kzm75"] Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.327643 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.329510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.332321 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzm75"] Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-catalog-content\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-utilities\") pod \"certified-operators-kzm75\" (UID: 
\"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-utilities\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5jts\" (UniqueName: \"kubernetes.io/projected/1f303ba2-d191-4ad6-a474-de409ea5475b-kube-api-access-d5jts\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2npn\" (UniqueName: \"kubernetes.io/projected/dcf562be-1a5c-41e2-9355-706b833cb56e-kube-api-access-t2npn\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.380606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-catalog-content\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.381028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-catalog-content\") pod \"redhat-marketplace-6lntk\" 
(UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.381063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f303ba2-d191-4ad6-a474-de409ea5475b-utilities\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.419813 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5jts\" (UniqueName: \"kubernetes.io/projected/1f303ba2-d191-4ad6-a474-de409ea5475b-kube-api-access-d5jts\") pod \"redhat-marketplace-6lntk\" (UID: \"1f303ba2-d191-4ad6-a474-de409ea5475b\") " pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.451241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.483560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2npn\" (UniqueName: \"kubernetes.io/projected/dcf562be-1a5c-41e2-9355-706b833cb56e-kube-api-access-t2npn\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.483646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-catalog-content\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.483665 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-utilities\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.484434 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-utilities\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.484665 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf562be-1a5c-41e2-9355-706b833cb56e-catalog-content\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.500743 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2npn\" (UniqueName: \"kubernetes.io/projected/dcf562be-1a5c-41e2-9355-706b833cb56e-kube-api-access-t2npn\") pod \"certified-operators-kzm75\" (UID: \"dcf562be-1a5c-41e2-9355-706b833cb56e\") " pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.587124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.593377 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.658619 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:26 crc kubenswrapper[4792]: I0318 15:41:26.831807 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6lntk"] Mar 18 15:41:27 crc kubenswrapper[4792]: W0318 15:41:27.055565 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf562be_1a5c_41e2_9355_706b833cb56e.slice/crio-1cb7cb87be19bad8d35f96031853f5a90f8e86f65aa301600e677e76583398dc WatchSource:0}: Error finding container 1cb7cb87be19bad8d35f96031853f5a90f8e86f65aa301600e677e76583398dc: Status 404 returned error can't find the container with id 1cb7cb87be19bad8d35f96031853f5a90f8e86f65aa301600e677e76583398dc Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.056802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzm75"] Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.592193 4792 generic.go:334] "Generic (PLEG): container finished" podID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerID="261e835f00a767339c1fc03593e73c799942380ea910976eb0ebf8f732f8eeb5" exitCode=0 Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.592283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzm75" event={"ID":"dcf562be-1a5c-41e2-9355-706b833cb56e","Type":"ContainerDied","Data":"261e835f00a767339c1fc03593e73c799942380ea910976eb0ebf8f732f8eeb5"} Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.592349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzm75" event={"ID":"dcf562be-1a5c-41e2-9355-706b833cb56e","Type":"ContainerStarted","Data":"1cb7cb87be19bad8d35f96031853f5a90f8e86f65aa301600e677e76583398dc"} Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.595658 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerID="f1ab5df7a2257fc1bc783e179949aa085ad3763b6efa9d6208337075a0263ff9" exitCode=0 Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.595737 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lntk" event={"ID":"1f303ba2-d191-4ad6-a474-de409ea5475b","Type":"ContainerDied","Data":"f1ab5df7a2257fc1bc783e179949aa085ad3763b6efa9d6208337075a0263ff9"} Mar 18 15:41:27 crc kubenswrapper[4792]: I0318 15:41:27.595795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lntk" event={"ID":"1f303ba2-d191-4ad6-a474-de409ea5475b","Type":"ContainerStarted","Data":"dde449061f5dec8ef47cb80a2f563934035bf60222d8e21006fe57915705255b"} Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.520707 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zw8n7"] Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.521803 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.524387 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.533407 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zw8n7"] Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.602882 4792 generic.go:334] "Generic (PLEG): container finished" podID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerID="5d19d4e65c507804b4c501fbdc2207c3a2867d0024f22aae4a0e4aebeeccc478" exitCode=0 Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.603035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzm75" event={"ID":"dcf562be-1a5c-41e2-9355-706b833cb56e","Type":"ContainerDied","Data":"5d19d4e65c507804b4c501fbdc2207c3a2867d0024f22aae4a0e4aebeeccc478"} Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.609418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-utilities\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.609465 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnqjv\" (UniqueName: \"kubernetes.io/projected/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-kube-api-access-mnqjv\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.609514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-catalog-content\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.710282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-catalog-content\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.710447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-utilities\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.710497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnqjv\" (UniqueName: \"kubernetes.io/projected/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-kube-api-access-mnqjv\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.712616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-catalog-content\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.713063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-utilities\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.723299 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.724458 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.730017 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.732766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnqjv\" (UniqueName: \"kubernetes.io/projected/e5ef8d1c-3435-4dcb-8397-2314c8795c3b-kube-api-access-mnqjv\") pod \"community-operators-zw8n7\" (UID: \"e5ef8d1c-3435-4dcb-8397-2314c8795c3b\") " pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.736492 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.811914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.812087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.812135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9tt\" (UniqueName: \"kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.839398 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.912903 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m9tt\" (UniqueName: \"kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.913820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.914005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 
crc kubenswrapper[4792]: I0318 15:41:28.914383 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.914960 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:28 crc kubenswrapper[4792]: I0318 15:41:28.931316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9tt\" (UniqueName: \"kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt\") pod \"redhat-operators-sxbbw\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.058083 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.258025 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zw8n7"] Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.456668 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 15:41:29 crc kubenswrapper[4792]: W0318 15:41:29.495151 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod283429bb_089d_4683_a2a9_581e26af8a6a.slice/crio-b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e WatchSource:0}: Error finding container b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e: Status 404 returned error can't find the container with id b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.609094 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerStarted","Data":"a9bb891cd1260870a7ea5683b1506cd23553ef73ce09e18168150c2bb71082bc"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.609146 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerStarted","Data":"b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.611138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzm75" event={"ID":"dcf562be-1a5c-41e2-9355-706b833cb56e","Type":"ContainerStarted","Data":"e516682e6bf00a1ed29199d19c025693639d16c6d01125a005bed76f27164666"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.612368 4792 
generic.go:334] "Generic (PLEG): container finished" podID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerID="b75a103db3b1f75553ee62019cca7bc412d38f9499aa242dbd1c9f81c4be4d67" exitCode=0 Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.612443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw8n7" event={"ID":"e5ef8d1c-3435-4dcb-8397-2314c8795c3b","Type":"ContainerDied","Data":"b75a103db3b1f75553ee62019cca7bc412d38f9499aa242dbd1c9f81c4be4d67"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.612469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw8n7" event={"ID":"e5ef8d1c-3435-4dcb-8397-2314c8795c3b","Type":"ContainerStarted","Data":"710e5737e4abc3e571010abd1775449840787347a8e7e4c05610537f205e1e33"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.614595 4792 generic.go:334] "Generic (PLEG): container finished" podID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerID="4ed36981fa0f8491857d2d8e06f00fbeb167bf2d3eb7724e969bd4a4d81bfd28" exitCode=0 Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.614674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lntk" event={"ID":"1f303ba2-d191-4ad6-a474-de409ea5475b","Type":"ContainerDied","Data":"4ed36981fa0f8491857d2d8e06f00fbeb167bf2d3eb7724e969bd4a4d81bfd28"} Mar 18 15:41:29 crc kubenswrapper[4792]: I0318 15:41:29.663488 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kzm75" podStartSLOduration=2.198863272 podStartE2EDuration="3.663466867s" podCreationTimestamp="2026-03-18 15:41:26 +0000 UTC" firstStartedPulling="2026-03-18 15:41:27.59506767 +0000 UTC m=+436.464396607" lastFinishedPulling="2026-03-18 15:41:29.059671265 +0000 UTC m=+437.929000202" observedRunningTime="2026-03-18 15:41:29.660559957 +0000 UTC m=+438.529888894" watchObservedRunningTime="2026-03-18 
15:41:29.663466867 +0000 UTC m=+438.532795814" Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.321434 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.321688 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.321729 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.322289 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.322362 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348" gracePeriod=600 Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.621468 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
containerID="e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348" exitCode=0 Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.621555 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348"} Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.621603 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe"} Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.621622 4792 scope.go:117] "RemoveContainer" containerID="9e007f3946191aafe73618c6f3c6590722b6a7a116be63affc04b502df9e98f8" Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.625623 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6lntk" event={"ID":"1f303ba2-d191-4ad6-a474-de409ea5475b","Type":"ContainerStarted","Data":"5cc0ff2435ae57cb21020fac43e65451aedd040d53fba3113169453c0f656b41"} Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.627306 4792 generic.go:334] "Generic (PLEG): container finished" podID="283429bb-089d-4683-a2a9-581e26af8a6a" containerID="a9bb891cd1260870a7ea5683b1506cd23553ef73ce09e18168150c2bb71082bc" exitCode=0 Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.627377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerDied","Data":"a9bb891cd1260870a7ea5683b1506cd23553ef73ce09e18168150c2bb71082bc"} Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.632024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zw8n7" event={"ID":"e5ef8d1c-3435-4dcb-8397-2314c8795c3b","Type":"ContainerStarted","Data":"25a1fe2d7ab9c084a094d1d09cf7a2e28a31a3eda066465ba5faeee488b3e964"} Mar 18 15:41:30 crc kubenswrapper[4792]: I0318 15:41:30.678868 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6lntk" podStartSLOduration=2.03141924 podStartE2EDuration="4.678849988s" podCreationTimestamp="2026-03-18 15:41:26 +0000 UTC" firstStartedPulling="2026-03-18 15:41:27.597846426 +0000 UTC m=+436.467175363" lastFinishedPulling="2026-03-18 15:41:30.245277174 +0000 UTC m=+439.114606111" observedRunningTime="2026-03-18 15:41:30.677567663 +0000 UTC m=+439.546896610" watchObservedRunningTime="2026-03-18 15:41:30.678849988 +0000 UTC m=+439.548178925" Mar 18 15:41:31 crc kubenswrapper[4792]: I0318 15:41:31.639433 4792 generic.go:334] "Generic (PLEG): container finished" podID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerID="25a1fe2d7ab9c084a094d1d09cf7a2e28a31a3eda066465ba5faeee488b3e964" exitCode=0 Mar 18 15:41:31 crc kubenswrapper[4792]: I0318 15:41:31.639620 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw8n7" event={"ID":"e5ef8d1c-3435-4dcb-8397-2314c8795c3b","Type":"ContainerDied","Data":"25a1fe2d7ab9c084a094d1d09cf7a2e28a31a3eda066465ba5faeee488b3e964"} Mar 18 15:41:31 crc kubenswrapper[4792]: I0318 15:41:31.648063 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerStarted","Data":"4e3f4c1a35e02ffa15d15a23f622cc4a603d0b0e539fb277895345da57f682c2"} Mar 18 15:41:32 crc kubenswrapper[4792]: I0318 15:41:32.655400 4792 generic.go:334] "Generic (PLEG): container finished" podID="283429bb-089d-4683-a2a9-581e26af8a6a" containerID="4e3f4c1a35e02ffa15d15a23f622cc4a603d0b0e539fb277895345da57f682c2" exitCode=0 Mar 18 
15:41:32 crc kubenswrapper[4792]: I0318 15:41:32.655502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerDied","Data":"4e3f4c1a35e02ffa15d15a23f622cc4a603d0b0e539fb277895345da57f682c2"} Mar 18 15:41:32 crc kubenswrapper[4792]: I0318 15:41:32.660396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw8n7" event={"ID":"e5ef8d1c-3435-4dcb-8397-2314c8795c3b","Type":"ContainerStarted","Data":"82b298e15697efff7f5e50ecb83628a82bd1d5c898fac000476ff14fb251ed1d"} Mar 18 15:41:33 crc kubenswrapper[4792]: I0318 15:41:33.668085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerStarted","Data":"0077bb29acb92306864d5545a151040ccb7524ded67ca47dc7a3f12787988ba5"} Mar 18 15:41:33 crc kubenswrapper[4792]: I0318 15:41:33.685322 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sxbbw" podStartSLOduration=3.252886673 podStartE2EDuration="5.685305122s" podCreationTimestamp="2026-03-18 15:41:28 +0000 UTC" firstStartedPulling="2026-03-18 15:41:30.630342296 +0000 UTC m=+439.499671233" lastFinishedPulling="2026-03-18 15:41:33.062760745 +0000 UTC m=+441.932089682" observedRunningTime="2026-03-18 15:41:33.683487382 +0000 UTC m=+442.552816329" watchObservedRunningTime="2026-03-18 15:41:33.685305122 +0000 UTC m=+442.554634059" Mar 18 15:41:33 crc kubenswrapper[4792]: I0318 15:41:33.687712 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zw8n7" podStartSLOduration=3.271133034 podStartE2EDuration="5.687701738s" podCreationTimestamp="2026-03-18 15:41:28 +0000 UTC" firstStartedPulling="2026-03-18 15:41:29.613426912 +0000 UTC m=+438.482755849" 
lastFinishedPulling="2026-03-18 15:41:32.029995616 +0000 UTC m=+440.899324553" observedRunningTime="2026-03-18 15:41:32.691134103 +0000 UTC m=+441.560463040" watchObservedRunningTime="2026-03-18 15:41:33.687701738 +0000 UTC m=+442.557030675" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.452011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.452418 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.498030 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.659218 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.659298 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.718857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.730222 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6lntk" Mar 18 15:41:36 crc kubenswrapper[4792]: I0318 15:41:36.767420 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kzm75" Mar 18 15:41:38 crc kubenswrapper[4792]: I0318 15:41:38.840185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:38 crc kubenswrapper[4792]: I0318 
15:41:38.842255 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:38 crc kubenswrapper[4792]: I0318 15:41:38.888193 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:39 crc kubenswrapper[4792]: I0318 15:41:39.058331 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:39 crc kubenswrapper[4792]: I0318 15:41:39.060160 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:39 crc kubenswrapper[4792]: I0318 15:41:39.745701 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zw8n7" Mar 18 15:41:40 crc kubenswrapper[4792]: I0318 15:41:40.093524 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sxbbw" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="registry-server" probeResult="failure" output=< Mar 18 15:41:40 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:41:40 crc kubenswrapper[4792]: > Mar 18 15:41:49 crc kubenswrapper[4792]: I0318 15:41:49.092759 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:49 crc kubenswrapper[4792]: I0318 15:41:49.128449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 15:41:49 crc kubenswrapper[4792]: I0318 15:41:49.926914 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" podUID="6b93393b-f935-45f2-9ad1-8a119230b1fa" containerName="registry" 
containerID="cri-o://65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c" gracePeriod=30 Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.317617 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508231 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508288 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508524 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdmsx\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.508587 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token\") pod \"6b93393b-f935-45f2-9ad1-8a119230b1fa\" (UID: \"6b93393b-f935-45f2-9ad1-8a119230b1fa\") " Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.510367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.510577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.515476 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx" (OuterVolumeSpecName: "kube-api-access-xdmsx") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "kube-api-access-xdmsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.515721 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.516092 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.528321 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.529227 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.532099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6b93393b-f935-45f2-9ad1-8a119230b1fa" (UID: "6b93393b-f935-45f2-9ad1-8a119230b1fa"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609689 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609731 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b93393b-f935-45f2-9ad1-8a119230b1fa-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609742 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609753 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b93393b-f935-45f2-9ad1-8a119230b1fa-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609762 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609769 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdmsx\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-kube-api-access-xdmsx\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.609777 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b93393b-f935-45f2-9ad1-8a119230b1fa-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:41:50 crc 
kubenswrapper[4792]: I0318 15:41:50.759856 4792 generic.go:334] "Generic (PLEG): container finished" podID="6b93393b-f935-45f2-9ad1-8a119230b1fa" containerID="65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c" exitCode=0 Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.759939 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.759960 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" event={"ID":"6b93393b-f935-45f2-9ad1-8a119230b1fa","Type":"ContainerDied","Data":"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c"} Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.760459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlsn2" event={"ID":"6b93393b-f935-45f2-9ad1-8a119230b1fa","Type":"ContainerDied","Data":"4f623c90af1277cb10ffa13da58e82d92808172f38af31c78b4680848fe32f03"} Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.760483 4792 scope.go:117] "RemoveContainer" containerID="65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.777189 4792 scope.go:117] "RemoveContainer" containerID="65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c" Mar 18 15:41:50 crc kubenswrapper[4792]: E0318 15:41:50.777548 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c\": container with ID starting with 65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c not found: ID does not exist" containerID="65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.777580 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c"} err="failed to get container status \"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c\": rpc error: code = NotFound desc = could not find container \"65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c\": container with ID starting with 65f02333fff3b1b97653088ba6ae495eef08e9d9870803f6059f3204dde0379c not found: ID does not exist" Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.789419 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:41:50 crc kubenswrapper[4792]: I0318 15:41:50.792791 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlsn2"] Mar 18 15:41:51 crc kubenswrapper[4792]: I0318 15:41:51.861205 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b93393b-f935-45f2-9ad1-8a119230b1fa" path="/var/lib/kubelet/pods/6b93393b-f935-45f2-9ad1-8a119230b1fa/volumes" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.138841 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564142-887f6"] Mar 18 15:42:00 crc kubenswrapper[4792]: E0318 15:42:00.139867 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b93393b-f935-45f2-9ad1-8a119230b1fa" containerName="registry" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.139906 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93393b-f935-45f2-9ad1-8a119230b1fa" containerName="registry" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.140098 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b93393b-f935-45f2-9ad1-8a119230b1fa" containerName="registry" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.140649 4792 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.144795 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.144847 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.145099 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.148707 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-887f6"] Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.247264 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwb5\" (UniqueName: \"kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5\") pod \"auto-csr-approver-29564142-887f6\" (UID: \"da70f823-e9c1-4847-855e-5ecd2db92e8d\") " pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.348540 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scwb5\" (UniqueName: \"kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5\") pod \"auto-csr-approver-29564142-887f6\" (UID: \"da70f823-e9c1-4847-855e-5ecd2db92e8d\") " pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.367855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwb5\" (UniqueName: \"kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5\") pod \"auto-csr-approver-29564142-887f6\" (UID: \"da70f823-e9c1-4847-855e-5ecd2db92e8d\") " 
pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.466537 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:00 crc kubenswrapper[4792]: I0318 15:42:00.879123 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-887f6"] Mar 18 15:42:01 crc kubenswrapper[4792]: I0318 15:42:01.828080 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-887f6" event={"ID":"da70f823-e9c1-4847-855e-5ecd2db92e8d","Type":"ContainerStarted","Data":"79cfb0adf3171dea5cbb4f9fb1416ea49c3b7bf643598dff147bd25c1e07c0f0"} Mar 18 15:42:02 crc kubenswrapper[4792]: I0318 15:42:02.835012 4792 generic.go:334] "Generic (PLEG): container finished" podID="da70f823-e9c1-4847-855e-5ecd2db92e8d" containerID="286cc85541ebae6e9dd07424b9158d547c2f534f5ae5d8725d31601b65cbc6bc" exitCode=0 Mar 18 15:42:02 crc kubenswrapper[4792]: I0318 15:42:02.835130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-887f6" event={"ID":"da70f823-e9c1-4847-855e-5ecd2db92e8d","Type":"ContainerDied","Data":"286cc85541ebae6e9dd07424b9158d547c2f534f5ae5d8725d31601b65cbc6bc"} Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.117177 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.197657 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scwb5\" (UniqueName: \"kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5\") pod \"da70f823-e9c1-4847-855e-5ecd2db92e8d\" (UID: \"da70f823-e9c1-4847-855e-5ecd2db92e8d\") " Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.203553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5" (OuterVolumeSpecName: "kube-api-access-scwb5") pod "da70f823-e9c1-4847-855e-5ecd2db92e8d" (UID: "da70f823-e9c1-4847-855e-5ecd2db92e8d"). InnerVolumeSpecName "kube-api-access-scwb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.299849 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scwb5\" (UniqueName: \"kubernetes.io/projected/da70f823-e9c1-4847-855e-5ecd2db92e8d-kube-api-access-scwb5\") on node \"crc\" DevicePath \"\"" Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.848354 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564142-887f6" event={"ID":"da70f823-e9c1-4847-855e-5ecd2db92e8d","Type":"ContainerDied","Data":"79cfb0adf3171dea5cbb4f9fb1416ea49c3b7bf643598dff147bd25c1e07c0f0"} Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.848384 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564142-887f6" Mar 18 15:42:04 crc kubenswrapper[4792]: I0318 15:42:04.848409 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cfb0adf3171dea5cbb4f9fb1416ea49c3b7bf643598dff147bd25c1e07c0f0" Mar 18 15:42:05 crc kubenswrapper[4792]: I0318 15:42:05.171670 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564136-wt69c"] Mar 18 15:42:05 crc kubenswrapper[4792]: I0318 15:42:05.175576 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564136-wt69c"] Mar 18 15:42:05 crc kubenswrapper[4792]: I0318 15:42:05.860290 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e094f66b-fe57-429a-b5cd-de6084a8aacb" path="/var/lib/kubelet/pods/e094f66b-fe57-429a-b5cd-de6084a8aacb/volumes" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.667300 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx"] Mar 18 15:42:17 crc kubenswrapper[4792]: E0318 15:42:17.668134 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da70f823-e9c1-4847-855e-5ecd2db92e8d" containerName="oc" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.668148 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="da70f823-e9c1-4847-855e-5ecd2db92e8d" containerName="oc" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.668268 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="da70f823-e9c1-4847-855e-5ecd2db92e8d" containerName="oc" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.668701 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.670646 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.670833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.670945 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.670943 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.670953 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.678784 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx"] Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.766421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.766469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7d7\" (UniqueName: \"kubernetes.io/projected/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-kube-api-access-gt7d7\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" 
(UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.766763 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.868450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.868511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.868539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7d7\" (UniqueName: \"kubernetes.io/projected/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-kube-api-access-gt7d7\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 
15:42:17.869705 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.884862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:17 crc kubenswrapper[4792]: I0318 15:42:17.888801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7d7\" (UniqueName: \"kubernetes.io/projected/d3ee8d56-2991-48b6-b0d0-a8d9bda05cac-kube-api-access-gt7d7\") pod \"cluster-monitoring-operator-6d5b84845-n6skx\" (UID: \"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:18 crc kubenswrapper[4792]: I0318 15:42:18.025174 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" Mar 18 15:42:18 crc kubenswrapper[4792]: I0318 15:42:18.408351 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx"] Mar 18 15:42:18 crc kubenswrapper[4792]: I0318 15:42:18.940772 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" event={"ID":"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac","Type":"ContainerStarted","Data":"74caa5aacfa1b6e8ddcfe04719b004ce17f8a3c5954194beccedacb1c0f60249"} Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.558150 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb"] Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.559286 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.561476 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-jbs8r" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.562287 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.570587 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb"] Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.699569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/171cc469-d2f4-4ca4-b841-144bb81881be-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2jldb\" (UID: 
\"171cc469-d2f4-4ca4-b841-144bb81881be\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.800556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/171cc469-d2f4-4ca4-b841-144bb81881be-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2jldb\" (UID: \"171cc469-d2f4-4ca4-b841-144bb81881be\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.808294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/171cc469-d2f4-4ca4-b841-144bb81881be-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-2jldb\" (UID: \"171cc469-d2f4-4ca4-b841-144bb81881be\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.874292 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.956614 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" event={"ID":"d3ee8d56-2991-48b6-b0d0-a8d9bda05cac","Type":"ContainerStarted","Data":"90c289442fc2a06b08b1b38090527bffef79177f818dd43a6c2913258364e320"} Mar 18 15:42:20 crc kubenswrapper[4792]: I0318 15:42:20.976586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-n6skx" podStartSLOduration=2.371917455 podStartE2EDuration="3.976532863s" podCreationTimestamp="2026-03-18 15:42:17 +0000 UTC" firstStartedPulling="2026-03-18 15:42:18.420683902 +0000 UTC m=+487.290012839" lastFinishedPulling="2026-03-18 15:42:20.02529931 +0000 UTC m=+488.894628247" observedRunningTime="2026-03-18 15:42:20.975767807 +0000 UTC m=+489.845096774" watchObservedRunningTime="2026-03-18 15:42:20.976532863 +0000 UTC m=+489.845861820" Mar 18 15:42:21 crc kubenswrapper[4792]: I0318 15:42:21.320589 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb"] Mar 18 15:42:21 crc kubenswrapper[4792]: W0318 15:42:21.328181 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171cc469_d2f4_4ca4_b841_144bb81881be.slice/crio-552ea90731dee50cc4493c6b16d561f53ca15f6cfb00abc481bd47264f5fab9f WatchSource:0}: Error finding container 552ea90731dee50cc4493c6b16d561f53ca15f6cfb00abc481bd47264f5fab9f: Status 404 returned error can't find the container with id 552ea90731dee50cc4493c6b16d561f53ca15f6cfb00abc481bd47264f5fab9f Mar 18 15:42:21 crc kubenswrapper[4792]: I0318 15:42:21.964535 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" event={"ID":"171cc469-d2f4-4ca4-b841-144bb81881be","Type":"ContainerStarted","Data":"552ea90731dee50cc4493c6b16d561f53ca15f6cfb00abc481bd47264f5fab9f"} Mar 18 15:42:22 crc kubenswrapper[4792]: I0318 15:42:22.971706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" event={"ID":"171cc469-d2f4-4ca4-b841-144bb81881be","Type":"ContainerStarted","Data":"03bb699f6af10beb26f633c4d3cd5b1a536082ecfeb4fcce3ee5370856a694e4"} Mar 18 15:42:22 crc kubenswrapper[4792]: I0318 15:42:22.972058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:22 crc kubenswrapper[4792]: I0318 15:42:22.978231 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 15:42:22 crc kubenswrapper[4792]: I0318 15:42:22.988701 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podStartSLOduration=1.7688342160000001 podStartE2EDuration="2.988683842s" podCreationTimestamp="2026-03-18 15:42:20 +0000 UTC" firstStartedPulling="2026-03-18 15:42:21.329769322 +0000 UTC m=+490.199098259" lastFinishedPulling="2026-03-18 15:42:22.549618948 +0000 UTC m=+491.418947885" observedRunningTime="2026-03-18 15:42:22.986228751 +0000 UTC m=+491.855557688" watchObservedRunningTime="2026-03-18 15:42:22.988683842 +0000 UTC m=+491.858012789" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.683031 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-spv9l"] Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.684454 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.686857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.687240 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.687854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-fr2dk" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.688478 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.697314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-spv9l"] Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.737632 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5v4\" (UniqueName: \"kubernetes.io/projected/4ce61695-eea4-47b3-8c24-3e465bef1b06-kube-api-access-nn5v4\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.737706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.737766 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.737835 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce61695-eea4-47b3-8c24-3e465bef1b06-metrics-client-ca\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.839245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce61695-eea4-47b3-8c24-3e465bef1b06-metrics-client-ca\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.839316 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5v4\" (UniqueName: \"kubernetes.io/projected/4ce61695-eea4-47b3-8c24-3e465bef1b06-kube-api-access-nn5v4\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.839351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-tls\") pod 
\"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.839385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.841038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce61695-eea4-47b3-8c24-3e465bef1b06-metrics-client-ca\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.845247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.845383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ce61695-eea4-47b3-8c24-3e465bef1b06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:23 crc kubenswrapper[4792]: I0318 15:42:23.858233 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5v4\" (UniqueName: \"kubernetes.io/projected/4ce61695-eea4-47b3-8c24-3e465bef1b06-kube-api-access-nn5v4\") pod \"prometheus-operator-db54df47d-spv9l\" (UID: \"4ce61695-eea4-47b3-8c24-3e465bef1b06\") " pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:24 crc kubenswrapper[4792]: I0318 15:42:24.028401 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" Mar 18 15:42:24 crc kubenswrapper[4792]: I0318 15:42:24.399484 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-spv9l"] Mar 18 15:42:24 crc kubenswrapper[4792]: I0318 15:42:24.983062 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" event={"ID":"4ce61695-eea4-47b3-8c24-3e465bef1b06","Type":"ContainerStarted","Data":"59b8d487bfbad7ef2178fef07651e71d3a0814833074c2a6f2ca33a062bedffe"} Mar 18 15:42:26 crc kubenswrapper[4792]: I0318 15:42:26.995079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" event={"ID":"4ce61695-eea4-47b3-8c24-3e465bef1b06","Type":"ContainerStarted","Data":"df1213f7f6ba2e77915d39e00487e6779708bf9c1cbaa03f9998aa52015fbca7"} Mar 18 15:42:26 crc kubenswrapper[4792]: I0318 15:42:26.995485 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" event={"ID":"4ce61695-eea4-47b3-8c24-3e465bef1b06","Type":"ContainerStarted","Data":"36a55b5ab72fd06bf932c6cb34296aac8b677a924672f86d44ace8be3490934c"} Mar 18 15:42:27 crc kubenswrapper[4792]: I0318 15:42:27.010530 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-spv9l" podStartSLOduration=2.296450769 
podStartE2EDuration="4.010510462s" podCreationTimestamp="2026-03-18 15:42:23 +0000 UTC" firstStartedPulling="2026-03-18 15:42:24.407037997 +0000 UTC m=+493.276366934" lastFinishedPulling="2026-03-18 15:42:26.12109769 +0000 UTC m=+494.990426627" observedRunningTime="2026-03-18 15:42:27.008501044 +0000 UTC m=+495.877829981" watchObservedRunningTime="2026-03-18 15:42:27.010510462 +0000 UTC m=+495.879839409" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.033649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9"] Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.035019 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.037001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.038090 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.038460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-nckdb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.049412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6s5vb"] Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.050544 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.052511 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-56k64" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.052819 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.053067 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.061553 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9"] Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.096651 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc"] Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.097642 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.100406 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.100557 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.100678 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.105348 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-gxc99" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106499 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjq2q\" (UniqueName: \"kubernetes.io/projected/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-kube-api-access-sjq2q\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-root\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " 
pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-wtmp\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-sys\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d74b12e7-04cc-4ccc-916e-fe5354a637aa-metrics-client-ca\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-textfile\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.106671 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455mn\" (UniqueName: \"kubernetes.io/projected/d74b12e7-04cc-4ccc-916e-fe5354a637aa-kube-api-access-455mn\") pod \"node-exporter-6s5vb\" (UID: 
\"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.123283 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc"] Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-sys\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d74b12e7-04cc-4ccc-916e-fe5354a637aa-metrics-client-ca\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/311f1db4-590e-45ae-90b3-59f83da8a3b5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213447 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-textfile\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-455mn\" (UniqueName: \"kubernetes.io/projected/d74b12e7-04cc-4ccc-916e-fe5354a637aa-kube-api-access-455mn\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213498 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " 
pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4kl9\" (UniqueName: \"kubernetes.io/projected/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-api-access-q4kl9\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjq2q\" (UniqueName: \"kubernetes.io/projected/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-kube-api-access-sjq2q\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-root\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213757 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213784 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.213814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-wtmp\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " 
pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.214046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-wtmp\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.214098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-sys\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.214872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d74b12e7-04cc-4ccc-916e-fe5354a637aa-metrics-client-ca\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.215206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-textfile\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.215603 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d74b12e7-04cc-4ccc-916e-fe5354a637aa-root\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: E0318 15:42:29.215701 4792 secret.go:188] Couldn't 
get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 18 15:42:29 crc kubenswrapper[4792]: E0318 15:42:29.215755 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls podName:f83a3eb5-12b3-452b-8c44-fa71ec64a8b1 nodeName:}" failed. No retries permitted until 2026-03-18 15:42:29.715738061 +0000 UTC m=+498.585067078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-5qqq9" (UID: "f83a3eb5-12b3-452b-8c44-fa71ec64a8b1") : secret "openshift-state-metrics-tls" not found Mar 18 15:42:29 crc kubenswrapper[4792]: E0318 15:42:29.215905 4792 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 18 15:42:29 crc kubenswrapper[4792]: E0318 15:42:29.216097 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls podName:d74b12e7-04cc-4ccc-916e-fe5354a637aa nodeName:}" failed. No retries permitted until 2026-03-18 15:42:29.716068712 +0000 UTC m=+498.585397729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls") pod "node-exporter-6s5vb" (UID: "d74b12e7-04cc-4ccc-916e-fe5354a637aa") : secret "node-exporter-tls" not found Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.216862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.223037 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.223695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.232019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-455mn\" (UniqueName: \"kubernetes.io/projected/d74b12e7-04cc-4ccc-916e-fe5354a637aa-kube-api-access-455mn\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc 
kubenswrapper[4792]: I0318 15:42:29.233194 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjq2q\" (UniqueName: \"kubernetes.io/projected/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-kube-api-access-sjq2q\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314640 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/311f1db4-590e-45ae-90b3-59f83da8a3b5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314685 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4kl9\" (UniqueName: \"kubernetes.io/projected/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-api-access-q4kl9\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.314779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.315594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/311f1db4-590e-45ae-90b3-59f83da8a3b5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.315630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.316128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/311f1db4-590e-45ae-90b3-59f83da8a3b5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.317838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.318467 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.332611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4kl9\" (UniqueName: \"kubernetes.io/projected/311f1db4-590e-45ae-90b3-59f83da8a3b5-kube-api-access-q4kl9\") pod \"kube-state-metrics-777cb5bd5d-nznsc\" (UID: \"311f1db4-590e-45ae-90b3-59f83da8a3b5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.429411 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.719247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.719360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.723862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f83a3eb5-12b3-452b-8c44-fa71ec64a8b1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5qqq9\" (UID: \"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.723877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d74b12e7-04cc-4ccc-916e-fe5354a637aa-node-exporter-tls\") pod \"node-exporter-6s5vb\" (UID: \"d74b12e7-04cc-4ccc-916e-fe5354a637aa\") " pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.821419 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc"] Mar 18 15:42:29 crc kubenswrapper[4792]: W0318 15:42:29.827434 4792 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod311f1db4_590e_45ae_90b3_59f83da8a3b5.slice/crio-a172d777016850950448691653003d169adda1b8d048d2e7df7ab53750303caa WatchSource:0}: Error finding container a172d777016850950448691653003d169adda1b8d048d2e7df7ab53750303caa: Status 404 returned error can't find the container with id a172d777016850950448691653003d169adda1b8d048d2e7df7ab53750303caa Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.950302 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" Mar 18 15:42:29 crc kubenswrapper[4792]: I0318 15:42:29.974731 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6s5vb" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.018554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6s5vb" event={"ID":"d74b12e7-04cc-4ccc-916e-fe5354a637aa","Type":"ContainerStarted","Data":"fb43436d5564a040a191be6caa354d1ae457218f74b67d7ba2288d14d715453b"} Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.025128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" event={"ID":"311f1db4-590e-45ae-90b3-59f83da8a3b5","Type":"ContainerStarted","Data":"a172d777016850950448691653003d169adda1b8d048d2e7df7ab53750303caa"} Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.095342 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.097326 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.104469 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.104715 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.105139 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.105284 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.108350 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.108519 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.108522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-g9frc" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.108894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.109306 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.110577 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225166 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-config-out\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8x24\" (UniqueName: \"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-kube-api-access-d8x24\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225327 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225578 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225599 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.225624 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-web-config\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326670 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-web-config\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-config-out\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8x24\" (UniqueName: 
\"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-kube-api-access-d8x24\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326864 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326888 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.326911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.328546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.329453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.330776 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1496d8a-d7a4-446e-b533-db083f3fd890-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.333346 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1496d8a-d7a4-446e-b533-db083f3fd890-config-out\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.333500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-web-config\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.333614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.333955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.334254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.335594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.349572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.349644 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/c1496d8a-d7a4-446e-b533-db083f3fd890-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.351131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8x24\" (UniqueName: \"kubernetes.io/projected/c1496d8a-d7a4-446e-b533-db083f3fd890-kube-api-access-d8x24\") pod \"alertmanager-main-0\" (UID: \"c1496d8a-d7a4-446e-b533-db083f3fd890\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.417507 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9"] Mar 18 15:42:30 crc kubenswrapper[4792]: W0318 15:42:30.421245 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83a3eb5_12b3_452b_8c44_fa71ec64a8b1.slice/crio-19dc252bad6175d65b44105ba5e0b64930769c5f1e21b1d52e9c1ce9885a7d2a WatchSource:0}: Error finding container 19dc252bad6175d65b44105ba5e0b64930769c5f1e21b1d52e9c1ce9885a7d2a: Status 404 returned error can't find the container with id 19dc252bad6175d65b44105ba5e0b64930769c5f1e21b1d52e9c1ce9885a7d2a Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.426687 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 15:42:30 crc kubenswrapper[4792]: I0318 15:42:30.850929 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 15:42:30 crc kubenswrapper[4792]: W0318 15:42:30.856433 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1496d8a_d7a4_446e_b533_db083f3fd890.slice/crio-056a7725bc90737644ff40c0727309ea3c3a389604e6f4715bd549a7c9c77683 WatchSource:0}: Error finding container 056a7725bc90737644ff40c0727309ea3c3a389604e6f4715bd549a7c9c77683: Status 404 returned error can't find the container with id 056a7725bc90737644ff40c0727309ea3c3a389604e6f4715bd549a7c9c77683 Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.008200 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5f879c84-szp77"] Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.009853 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.012953 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.013452 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.013625 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-l9hf5" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.013754 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.013877 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-57jjan2vfbhhv" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.014112 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.014243 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.025677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f879c84-szp77"] Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.052675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" event={"ID":"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1","Type":"ContainerStarted","Data":"291f7a8db0a6bacccdac8fd86efbefa5d727772f869b3f6ed954fc13a405e310"} Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.052725 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" event={"ID":"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1","Type":"ContainerStarted","Data":"74da937e6bbe7cda072a71e363c49989621be583047db2774f57c813c22d0148"} Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.052740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" event={"ID":"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1","Type":"ContainerStarted","Data":"19dc252bad6175d65b44105ba5e0b64930769c5f1e21b1d52e9c1ce9885a7d2a"} Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.053820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"056a7725bc90737644ff40c0727309ea3c3a389604e6f4715bd549a7c9c77683"} Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151001 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-grpc-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151126 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151160 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv85n\" (UniqueName: \"kubernetes.io/projected/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-kube-api-access-nv85n\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151194 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-metrics-client-ca\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151242 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: 
\"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.151288 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-metrics-client-ca\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252703 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252766 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-grpc-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.252890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv85n\" (UniqueName: \"kubernetes.io/projected/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-kube-api-access-nv85n\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " 
pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.253512 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-metrics-client-ca\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.258583 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.259607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.260822 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.260878 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.261959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.270593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-secret-grpc-tls\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.271319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv85n\" (UniqueName: \"kubernetes.io/projected/a13f5c90-59fc-4b23-bf4d-0d4de34083e9-kube-api-access-nv85n\") pod \"thanos-querier-5f879c84-szp77\" (UID: \"a13f5c90-59fc-4b23-bf4d-0d4de34083e9\") " pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:31 crc kubenswrapper[4792]: I0318 15:42:31.355310 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:32 crc kubenswrapper[4792]: I0318 15:42:32.675345 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f879c84-szp77"] Mar 18 15:42:32 crc kubenswrapper[4792]: W0318 15:42:32.873883 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda13f5c90_59fc_4b23_bf4d_0d4de34083e9.slice/crio-2ecc4521925f54659fa70f2d24c5519491c7999c0b10023ae74debefe65c849b WatchSource:0}: Error finding container 2ecc4521925f54659fa70f2d24c5519491c7999c0b10023ae74debefe65c849b: Status 404 returned error can't find the container with id 2ecc4521925f54659fa70f2d24c5519491c7999c0b10023ae74debefe65c849b Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.072256 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" event={"ID":"311f1db4-590e-45ae-90b3-59f83da8a3b5","Type":"ContainerStarted","Data":"1471e1cb0bff8d8513652d9c52444b22fd16cf6f824b70517cb7a2f58a8864ba"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.072312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" event={"ID":"311f1db4-590e-45ae-90b3-59f83da8a3b5","Type":"ContainerStarted","Data":"a3cc64fc80bdf1c614571a47154e3ff48e5de30af4ecdf6b828b6249f23694d4"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.072326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" event={"ID":"311f1db4-590e-45ae-90b3-59f83da8a3b5","Type":"ContainerStarted","Data":"eebb6d63da08aa1d7735b51c15d9521a927999534871ce8a47d27fe49d6fdd95"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.076085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" 
event={"ID":"f83a3eb5-12b3-452b-8c44-fa71ec64a8b1","Type":"ContainerStarted","Data":"2d73edf2be6e5f01fd2c3e76974adf7ed0b7b60261efdf2ee73f04794f1518f8"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.078088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"48b69d789638edb6cd3f2b3298c717a4b89f429e62bfc06465da403d0ed722da"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.079284 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"2ecc4521925f54659fa70f2d24c5519491c7999c0b10023ae74debefe65c849b"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.082018 4792 generic.go:334] "Generic (PLEG): container finished" podID="d74b12e7-04cc-4ccc-916e-fe5354a637aa" containerID="616971d7d748da4ff7db1d4f64885aeb7a3ae8a823117f6e6fd7137b16f55ece" exitCode=0 Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.082078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6s5vb" event={"ID":"d74b12e7-04cc-4ccc-916e-fe5354a637aa","Type":"ContainerDied","Data":"616971d7d748da4ff7db1d4f64885aeb7a3ae8a823117f6e6fd7137b16f55ece"} Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.089739 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-nznsc" podStartSLOduration=1.698070529 podStartE2EDuration="4.089714305s" podCreationTimestamp="2026-03-18 15:42:29 +0000 UTC" firstStartedPulling="2026-03-18 15:42:29.829673362 +0000 UTC m=+498.699002299" lastFinishedPulling="2026-03-18 15:42:32.221317138 +0000 UTC m=+501.090646075" observedRunningTime="2026-03-18 15:42:33.087645567 +0000 UTC m=+501.956974504" watchObservedRunningTime="2026-03-18 15:42:33.089714305 +0000 UTC 
m=+501.959043242" Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.146865 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5qqq9" podStartSLOduration=2.594507411 podStartE2EDuration="4.146845662s" podCreationTimestamp="2026-03-18 15:42:29 +0000 UTC" firstStartedPulling="2026-03-18 15:42:30.763265612 +0000 UTC m=+499.632594549" lastFinishedPulling="2026-03-18 15:42:32.315603863 +0000 UTC m=+501.184932800" observedRunningTime="2026-03-18 15:42:33.146261963 +0000 UTC m=+502.015590910" watchObservedRunningTime="2026-03-18 15:42:33.146845662 +0000 UTC m=+502.016174589" Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.860744 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.863363 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:33 crc kubenswrapper[4792]: I0318 15:42:33.887477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.001666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.001776 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " 
pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.002147 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.002385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.002414 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.002504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjmk\" (UniqueName: \"kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.002534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert\") pod 
\"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.090982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6s5vb" event={"ID":"d74b12e7-04cc-4ccc-916e-fe5354a637aa","Type":"ContainerStarted","Data":"50b5c3ae780e74b04779c48c52258aaee6d31e5940a23eedfb1d11cc4863378a"} Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.091029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6s5vb" event={"ID":"d74b12e7-04cc-4ccc-916e-fe5354a637aa","Type":"ContainerStarted","Data":"799be1c9576c0bb67fe3b9eb0308f6fc6a784583d1fab9ae1f69c9aa23ba308d"} Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.092824 4792 generic.go:334] "Generic (PLEG): container finished" podID="c1496d8a-d7a4-446e-b533-db083f3fd890" containerID="48b69d789638edb6cd3f2b3298c717a4b89f429e62bfc06465da403d0ed722da" exitCode=0 Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.094944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerDied","Data":"48b69d789638edb6cd3f2b3298c717a4b89f429e62bfc06465da403d0ed722da"} Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104356 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104433 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca\") pod 
\"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjmk\" (UniqueName: \"kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.104603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert\") pod 
\"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.105578 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.105641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.105806 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.105887 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.113184 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " 
pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.117598 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.142796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjmk\" (UniqueName: \"kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk\") pod \"console-57cbf6ff7f-ckxw6\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.151923 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6s5vb" podStartSLOduration=2.937974069 podStartE2EDuration="5.151898494s" podCreationTimestamp="2026-03-18 15:42:29 +0000 UTC" firstStartedPulling="2026-03-18 15:42:30.003545136 +0000 UTC m=+498.872874073" lastFinishedPulling="2026-03-18 15:42:32.217469561 +0000 UTC m=+501.086798498" observedRunningTime="2026-03-18 15:42:34.112487062 +0000 UTC m=+502.981816009" watchObservedRunningTime="2026-03-18 15:42:34.151898494 +0000 UTC m=+503.021227431" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.197964 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.420210 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7db98db598-wxffp"] Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.424320 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.427722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.427731 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.427893 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.427902 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-szjkj" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.428025 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.428312 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-bgsvkobvrok43" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.428782 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7db98db598-wxffp"] Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513363 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-client-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-client-certs\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513643 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-metrics-server-audit-profiles\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513714 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/de45de1b-91b3-41ff-9f73-95048b051745-audit-log\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-server-tls\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " 
pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.513940 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmcdn\" (UniqueName: \"kubernetes.io/projected/de45de1b-91b3-41ff-9f73-95048b051745-kube-api-access-hmcdn\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.591051 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:42:34 crc kubenswrapper[4792]: W0318 15:42:34.601527 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00de2e5a_3cdc_4b5a_93f2_4642c1b3dd8a.slice/crio-01f838b0d9f1abd17d7fcc46c43711b0ac5a81cb76a38c79556d7ad5f4b737dd WatchSource:0}: Error finding container 01f838b0d9f1abd17d7fcc46c43711b0ac5a81cb76a38c79556d7ad5f4b737dd: Status 404 returned error can't find the container with id 01f838b0d9f1abd17d7fcc46c43711b0ac5a81cb76a38c79556d7ad5f4b737dd Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615452 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-metrics-server-audit-profiles\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/de45de1b-91b3-41ff-9f73-95048b051745-audit-log\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " 
pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-server-tls\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmcdn\" (UniqueName: \"kubernetes.io/projected/de45de1b-91b3-41ff-9f73-95048b051745-kube-api-access-hmcdn\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-client-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.615863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-client-certs\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.616283 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/de45de1b-91b3-41ff-9f73-95048b051745-audit-log\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.616537 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.616911 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/de45de1b-91b3-41ff-9f73-95048b051745-metrics-server-audit-profiles\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.621490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-client-ca-bundle\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.621911 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-server-tls\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.621950 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/de45de1b-91b3-41ff-9f73-95048b051745-secret-metrics-client-certs\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.634830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmcdn\" (UniqueName: \"kubernetes.io/projected/de45de1b-91b3-41ff-9f73-95048b051745-kube-api-access-hmcdn\") pod \"metrics-server-7db98db598-wxffp\" (UID: \"de45de1b-91b3-41ff-9f73-95048b051745\") " pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.749753 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.818122 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk"] Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.819160 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.821289 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.822254 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.834125 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk"] Mar 18 15:42:34 crc kubenswrapper[4792]: I0318 15:42:34.920753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb8e68c-2183-4f5a-88a3-8c274d017247-monitoring-plugin-cert\") pod \"monitoring-plugin-69764bd9c7-ntlfk\" (UID: \"7bb8e68c-2183-4f5a-88a3-8c274d017247\") " pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.021930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb8e68c-2183-4f5a-88a3-8c274d017247-monitoring-plugin-cert\") pod \"monitoring-plugin-69764bd9c7-ntlfk\" (UID: \"7bb8e68c-2183-4f5a-88a3-8c274d017247\") " pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.027756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb8e68c-2183-4f5a-88a3-8c274d017247-monitoring-plugin-cert\") pod \"monitoring-plugin-69764bd9c7-ntlfk\" (UID: \"7bb8e68c-2183-4f5a-88a3-8c274d017247\") " pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.099356 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-ckxw6" event={"ID":"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a","Type":"ContainerStarted","Data":"01f838b0d9f1abd17d7fcc46c43711b0ac5a81cb76a38c79556d7ad5f4b737dd"} Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.134548 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.187525 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7db98db598-wxffp"] Mar 18 15:42:35 crc kubenswrapper[4792]: W0318 15:42:35.199736 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde45de1b_91b3_41ff_9f73_95048b051745.slice/crio-18ce849478e0dd0e91fc76d833ae71158d4164c521a71261de4e299d0211ee0b WatchSource:0}: Error finding container 18ce849478e0dd0e91fc76d833ae71158d4164c521a71261de4e299d0211ee0b: Status 404 returned error can't find the container with id 18ce849478e0dd0e91fc76d833ae71158d4164c521a71261de4e299d0211ee0b Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.267450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.269303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.273393 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.273524 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.273399 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.274007 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-mtkdp" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.274776 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.274898 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.275041 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.275142 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.275247 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.275584 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-eraijsvvvantq" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.276024 4792 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.278511 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.278532 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.289824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-config-out\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428327 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m54zt\" (UniqueName: \"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-kube-api-access-m54zt\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428545 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: 
I0318 15:42:35.428650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-web-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 
15:42:35.428864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.428993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.429024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530132 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530918 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530957 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-web-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.530988 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531030 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531138 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-config-out\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531201 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531281 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.531289 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m54zt\" (UniqueName: \"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-kube-api-access-m54zt\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.532633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 
15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.532771 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.532867 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.533500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.536310 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-web-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.536360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc 
kubenswrapper[4792]: I0318 15:42:35.536555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.536570 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-config\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.536852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.537315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0261599b-51cd-4d30-8c8b-d146dc22de90-config-out\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.537733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.538287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.538439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.538956 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0261599b-51cd-4d30-8c8b-d146dc22de90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.539125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0261599b-51cd-4d30-8c8b-d146dc22de90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.541632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.556886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m54zt\" (UniqueName: 
\"kubernetes.io/projected/0261599b-51cd-4d30-8c8b-d146dc22de90-kube-api-access-m54zt\") pod \"prometheus-k8s-0\" (UID: \"0261599b-51cd-4d30-8c8b-d146dc22de90\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.588582 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk"] Mar 18 15:42:35 crc kubenswrapper[4792]: I0318 15:42:35.593889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:36 crc kubenswrapper[4792]: I0318 15:42:36.113427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" event={"ID":"7bb8e68c-2183-4f5a-88a3-8c274d017247","Type":"ContainerStarted","Data":"a23bad413c84086abad3bd9a9d11143543790fb56a500575d45784e26da86564"} Mar 18 15:42:36 crc kubenswrapper[4792]: I0318 15:42:36.115150 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-ckxw6" event={"ID":"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a","Type":"ContainerStarted","Data":"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc"} Mar 18 15:42:36 crc kubenswrapper[4792]: I0318 15:42:36.119379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" event={"ID":"de45de1b-91b3-41ff-9f73-95048b051745","Type":"ContainerStarted","Data":"18ce849478e0dd0e91fc76d833ae71158d4164c521a71261de4e299d0211ee0b"} Mar 18 15:42:36 crc kubenswrapper[4792]: I0318 15:42:36.134596 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57cbf6ff7f-ckxw6" podStartSLOduration=3.13457593 podStartE2EDuration="3.13457593s" podCreationTimestamp="2026-03-18 15:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
15:42:36.130702212 +0000 UTC m=+505.000031139" watchObservedRunningTime="2026-03-18 15:42:36.13457593 +0000 UTC m=+505.003904867" Mar 18 15:42:36 crc kubenswrapper[4792]: I0318 15:42:36.517617 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 15:42:36 crc kubenswrapper[4792]: W0318 15:42:36.805248 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0261599b_51cd_4d30_8c8b_d146dc22de90.slice/crio-dc623516f22b872471d0bdbf8509b76fca33518199647450de7b8c2d611b579d WatchSource:0}: Error finding container dc623516f22b872471d0bdbf8509b76fca33518199647450de7b8c2d611b579d: Status 404 returned error can't find the container with id dc623516f22b872471d0bdbf8509b76fca33518199647450de7b8c2d611b579d Mar 18 15:42:37 crc kubenswrapper[4792]: I0318 15:42:37.127007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"5103cdd398d65cafcd40c852189d01bf3567b3833666a5e357689b7453f89ad6"} Mar 18 15:42:37 crc kubenswrapper[4792]: I0318 15:42:37.129081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"3a085f4f509a940505cbd6bd2bfddd7782e101773a74bf2f3be9d9af7749e2c5"} Mar 18 15:42:37 crc kubenswrapper[4792]: I0318 15:42:37.130858 4792 generic.go:334] "Generic (PLEG): container finished" podID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerID="a5fb07908a7810f5c35fdbe2e446df1d1c110a44be60ce37f749f548942217ba" exitCode=0 Mar 18 15:42:37 crc kubenswrapper[4792]: I0318 15:42:37.130908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerDied","Data":"a5fb07908a7810f5c35fdbe2e446df1d1c110a44be60ce37f749f548942217ba"} Mar 18 15:42:37 crc kubenswrapper[4792]: I0318 15:42:37.130992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"dc623516f22b872471d0bdbf8509b76fca33518199647450de7b8c2d611b579d"} Mar 18 15:42:38 crc kubenswrapper[4792]: I0318 15:42:38.138302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"c608d5105ad98cb74dc0d69c81aebfd261bd8c787889600bcbe635ce851d92e1"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.146734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"0c12fb14f9617e57bc9f27dea4f53360c0a8e995174c5558680b2174bc335a06"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.146779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"ba79bf49ca022551af8acbd3b4ea60d2c0e50d027cfc14d3fc957163f8bc5856"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.146789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"7062a9dd9fb31871d3d87b1e3bbfbd00d169c9b0e60336279605c4cf042e87ff"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.148589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" 
event={"ID":"7bb8e68c-2183-4f5a-88a3-8c274d017247","Type":"ContainerStarted","Data":"71f21830636ac3e2e72a4cdf5703aaffeab7e7442d41ea89a696d05e3fa6c187"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.148874 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.153095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"1605c36198022199df869d6f06c541a088e6f8e778e73a57bbbcad7f47ddc55a"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.153148 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"4d1f039c0ee70bf0b7e75b5e6ff34b2514388ab797d1cd0b7e17a6dd7c95baaa"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.154749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" event={"ID":"de45de1b-91b3-41ff-9f73-95048b051745","Type":"ContainerStarted","Data":"7dcf6c2aad91395759deaffed3cf4a619a6b8e3f5a028b7314b61ff1d295024e"} Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.158612 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.165325 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" podStartSLOduration=3.163240621 podStartE2EDuration="5.165310768s" podCreationTimestamp="2026-03-18 15:42:34 +0000 UTC" firstStartedPulling="2026-03-18 15:42:36.107136703 +0000 UTC m=+504.976465640" lastFinishedPulling="2026-03-18 15:42:38.10920685 +0000 UTC 
m=+506.978535787" observedRunningTime="2026-03-18 15:42:39.160130187 +0000 UTC m=+508.029459124" watchObservedRunningTime="2026-03-18 15:42:39.165310768 +0000 UTC m=+508.034639705" Mar 18 15:42:39 crc kubenswrapper[4792]: I0318 15:42:39.182289 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" podStartSLOduration=2.277696379 podStartE2EDuration="5.182252048s" podCreationTimestamp="2026-03-18 15:42:34 +0000 UTC" firstStartedPulling="2026-03-18 15:42:35.20220204 +0000 UTC m=+504.071530977" lastFinishedPulling="2026-03-18 15:42:38.106757709 +0000 UTC m=+506.976086646" observedRunningTime="2026-03-18 15:42:39.177442489 +0000 UTC m=+508.046771446" watchObservedRunningTime="2026-03-18 15:42:39.182252048 +0000 UTC m=+508.051580985" Mar 18 15:42:40 crc kubenswrapper[4792]: I0318 15:42:40.164704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1496d8a-d7a4-446e-b533-db083f3fd890","Type":"ContainerStarted","Data":"a15a8962b4be60b07d7ad173baf2faa3e1f663f1b0f67b51276fe23b4b95d438"} Mar 18 15:42:40 crc kubenswrapper[4792]: I0318 15:42:40.167267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"02e0c1cf537010982524cdc021542c44acc7251c04597b14eee42b6af146ef63"} Mar 18 15:42:40 crc kubenswrapper[4792]: I0318 15:42:40.198491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.140853136 podStartE2EDuration="10.198466407s" podCreationTimestamp="2026-03-18 15:42:30 +0000 UTC" firstStartedPulling="2026-03-18 15:42:30.859796291 +0000 UTC m=+499.729125228" lastFinishedPulling="2026-03-18 15:42:39.917409552 +0000 UTC m=+508.786738499" observedRunningTime="2026-03-18 15:42:40.19190317 +0000 UTC m=+509.061232127" 
watchObservedRunningTime="2026-03-18 15:42:40.198466407 +0000 UTC m=+509.067795344" Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.183195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"9a486fec4c04267861f4f4a59629725372cb6b2967a4798cc9ca3151571d2f37"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.184093 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" event={"ID":"a13f5c90-59fc-4b23-bf4d-0d4de34083e9","Type":"ContainerStarted","Data":"5e45f36cc60b4c584ccd4b29e27b96a464c6794d020bcc115d31ed7406265576"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.184123 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"0bbd60e2e521db004b12c90399f6dde044e5624ea84fee7a34a83419c457f4c8"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"b01fd20e3f64f1cc83e4058b87fb544bf6ec242ea5f0b5e48307b3b05629c15d"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188406 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"504140f7ee545c2090911ff8e31f1ca3e4c369c14f20b5ec17c2c88b1d4d9a45"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"9c4aa90dce8b0899bf4c7f780ea16cbea596bb6bf602635c83bf9e350757298e"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"18b5542f3e457d4528dbcc0f748cb81255ea4a5c082131aadae510348f8927cc"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.188439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0261599b-51cd-4d30-8c8b-d146dc22de90","Type":"ContainerStarted","Data":"0eb727c5169e34bd14d8dfc16d42341c9e1865fa1804951cc0a0d7af7ff4ee42"} Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.205130 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" podStartSLOduration=5.183406278 podStartE2EDuration="12.205108255s" podCreationTimestamp="2026-03-18 15:42:30 +0000 UTC" firstStartedPulling="2026-03-18 15:42:32.879829081 +0000 UTC m=+501.749158028" lastFinishedPulling="2026-03-18 15:42:39.901531068 +0000 UTC m=+508.770860005" observedRunningTime="2026-03-18 15:42:42.202050174 +0000 UTC m=+511.071379121" watchObservedRunningTime="2026-03-18 15:42:42.205108255 +0000 UTC m=+511.074437202" Mar 18 15:42:42 crc kubenswrapper[4792]: I0318 15:42:42.232260 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.235652297 podStartE2EDuration="7.232243892s" podCreationTimestamp="2026-03-18 15:42:35 +0000 UTC" firstStartedPulling="2026-03-18 15:42:37.132709182 +0000 UTC m=+506.002038119" lastFinishedPulling="2026-03-18 15:42:41.129300777 +0000 UTC m=+509.998629714" observedRunningTime="2026-03-18 15:42:42.231128085 +0000 UTC m=+511.100457042" 
watchObservedRunningTime="2026-03-18 15:42:42.232243892 +0000 UTC m=+511.101572829" Mar 18 15:42:43 crc kubenswrapper[4792]: I0318 15:42:43.205591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" Mar 18 15:42:44 crc kubenswrapper[4792]: I0318 15:42:44.198343 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:44 crc kubenswrapper[4792]: I0318 15:42:44.198690 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:44 crc kubenswrapper[4792]: I0318 15:42:44.202227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:45 crc kubenswrapper[4792]: I0318 15:42:45.210173 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:42:45 crc kubenswrapper[4792]: I0318 15:42:45.256063 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:42:45 crc kubenswrapper[4792]: I0318 15:42:45.594239 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:42:54 crc kubenswrapper[4792]: I0318 15:42:54.751080 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:42:54 crc kubenswrapper[4792]: I0318 15:42:54.751737 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:43:10 crc kubenswrapper[4792]: I0318 15:43:10.291860 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-gs8sw" podUID="089a8476-5fcb-4378-8113-4a7162685b16" 
containerName="console" containerID="cri-o://2ab6db78b289f688262fd75b87e727ca0bf2093cdb8a5376f333838d668cf575" gracePeriod=15 Mar 18 15:43:11 crc kubenswrapper[4792]: I0318 15:43:11.371745 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gs8sw_089a8476-5fcb-4378-8113-4a7162685b16/console/0.log" Mar 18 15:43:11 crc kubenswrapper[4792]: I0318 15:43:11.372067 4792 generic.go:334] "Generic (PLEG): container finished" podID="089a8476-5fcb-4378-8113-4a7162685b16" containerID="2ab6db78b289f688262fd75b87e727ca0bf2093cdb8a5376f333838d668cf575" exitCode=2 Mar 18 15:43:11 crc kubenswrapper[4792]: I0318 15:43:11.372103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gs8sw" event={"ID":"089a8476-5fcb-4378-8113-4a7162685b16","Type":"ContainerDied","Data":"2ab6db78b289f688262fd75b87e727ca0bf2093cdb8a5376f333838d668cf575"} Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.540883 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gs8sw_089a8476-5fcb-4378-8113-4a7162685b16/console/0.log" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.541215 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693406 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693447 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693474 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693566 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693593 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-kntgf\" (UniqueName: \"kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693626 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert\") pod \"089a8476-5fcb-4378-8113-4a7162685b16\" (UID: \"089a8476-5fcb-4378-8113-4a7162685b16\") " Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693882 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca" (OuterVolumeSpecName: "service-ca") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.693916 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694116 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config" (OuterVolumeSpecName: "console-config") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694469 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694483 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694491 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.694500 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/089a8476-5fcb-4378-8113-4a7162685b16-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.698822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf" (OuterVolumeSpecName: "kube-api-access-kntgf") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "kube-api-access-kntgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.703151 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.703441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "089a8476-5fcb-4378-8113-4a7162685b16" (UID: "089a8476-5fcb-4378-8113-4a7162685b16"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.796249 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kntgf\" (UniqueName: \"kubernetes.io/projected/089a8476-5fcb-4378-8113-4a7162685b16-kube-api-access-kntgf\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.796330 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:12 crc kubenswrapper[4792]: I0318 15:43:12.796345 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/089a8476-5fcb-4378-8113-4a7162685b16-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.392763 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-gs8sw_089a8476-5fcb-4378-8113-4a7162685b16/console/0.log" Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.392832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gs8sw" event={"ID":"089a8476-5fcb-4378-8113-4a7162685b16","Type":"ContainerDied","Data":"fb5a76a2f3e408997aa0b5429402e8411660efb08b645365ee8366c65342a8c9"} Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.392876 4792 scope.go:117] "RemoveContainer" containerID="2ab6db78b289f688262fd75b87e727ca0bf2093cdb8a5376f333838d668cf575" Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.392900 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gs8sw" Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.423306 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.432838 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-gs8sw"] Mar 18 15:43:13 crc kubenswrapper[4792]: I0318 15:43:13.888936 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089a8476-5fcb-4378-8113-4a7162685b16" path="/var/lib/kubelet/pods/089a8476-5fcb-4378-8113-4a7162685b16/volumes" Mar 18 15:43:14 crc kubenswrapper[4792]: I0318 15:43:14.756584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:43:14 crc kubenswrapper[4792]: I0318 15:43:14.762051 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" Mar 18 15:43:30 crc kubenswrapper[4792]: I0318 15:43:30.321387 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:43:30 crc kubenswrapper[4792]: I0318 15:43:30.321932 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:43:35 crc kubenswrapper[4792]: I0318 15:43:35.594505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:43:35 crc kubenswrapper[4792]: I0318 15:43:35.633715 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:43:36 crc kubenswrapper[4792]: I0318 15:43:36.588276 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.711337 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:43:51 crc kubenswrapper[4792]: E0318 15:43:51.712794 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089a8476-5fcb-4378-8113-4a7162685b16" containerName="console" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.712816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="089a8476-5fcb-4378-8113-4a7162685b16" containerName="console" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.713097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="089a8476-5fcb-4378-8113-4a7162685b16" containerName="console" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.713889 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.725419 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887325 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j97w4\" (UniqueName: \"kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887539 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.887661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.989511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.989753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.989855 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j97w4\" (UniqueName: \"kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.989915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.990800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.990846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.991058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.991062 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.991222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.992373 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.992507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:51 crc kubenswrapper[4792]: I0318 15:43:51.999738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.000069 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.010908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j97w4\" (UniqueName: \"kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4\") pod \"console-54d8f469b-rjgv6\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.037808 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.460804 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.670100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54d8f469b-rjgv6" event={"ID":"149e0c91-96bc-4c5d-9f3f-e2558cc912cb","Type":"ContainerStarted","Data":"145dbc6aba298088d9997334d38be5dbc21489a12a9795bb914d5bdc94322564"} Mar 18 15:43:52 crc kubenswrapper[4792]: I0318 15:43:52.670159 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54d8f469b-rjgv6" event={"ID":"149e0c91-96bc-4c5d-9f3f-e2558cc912cb","Type":"ContainerStarted","Data":"f02c5d09e58e2e8530d56bb513d73369d42f0bac9176843bc75ab48e27875341"} Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.134593 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54d8f469b-rjgv6" podStartSLOduration=9.134571243 podStartE2EDuration="9.134571243s" podCreationTimestamp="2026-03-18 15:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 15:43:53.696806806 +0000 UTC m=+582.566135753" watchObservedRunningTime="2026-03-18 15:44:00.134571243 +0000 UTC m=+589.003900180" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.141390 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564144-nl5nl"] Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.142288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.144037 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.144143 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.145126 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.147489 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-nl5nl"] Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.316118 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t86j\" (UniqueName: \"kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j\") pod \"auto-csr-approver-29564144-nl5nl\" (UID: \"b441fdb2-bae8-499e-b5a5-1872dcd1f704\") " pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.322001 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.322047 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.418029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t86j\" (UniqueName: \"kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j\") pod \"auto-csr-approver-29564144-nl5nl\" (UID: \"b441fdb2-bae8-499e-b5a5-1872dcd1f704\") " pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.445045 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t86j\" (UniqueName: \"kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j\") pod \"auto-csr-approver-29564144-nl5nl\" (UID: \"b441fdb2-bae8-499e-b5a5-1872dcd1f704\") " pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.460734 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.651162 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-nl5nl"] Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.659878 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:44:00 crc kubenswrapper[4792]: I0318 15:44:00.719639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" event={"ID":"b441fdb2-bae8-499e-b5a5-1872dcd1f704","Type":"ContainerStarted","Data":"75ba9b6df514b989e0eb04b044bd2ed9108edc0a039c62ef783f162827de1e2a"} Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.038505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.038857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.044649 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.733739 4792 generic.go:334] "Generic (PLEG): container finished" podID="b441fdb2-bae8-499e-b5a5-1872dcd1f704" containerID="c21f868d9a176b0049aedfd41baf14e6ca159cc91748af1aebf646656ea0cab2" exitCode=0 Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.736218 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" event={"ID":"b441fdb2-bae8-499e-b5a5-1872dcd1f704","Type":"ContainerDied","Data":"c21f868d9a176b0049aedfd41baf14e6ca159cc91748af1aebf646656ea0cab2"} Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.739248 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:44:02 crc kubenswrapper[4792]: I0318 15:44:02.835128 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:44:03 crc kubenswrapper[4792]: I0318 15:44:03.973464 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.074880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t86j\" (UniqueName: \"kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j\") pod \"b441fdb2-bae8-499e-b5a5-1872dcd1f704\" (UID: \"b441fdb2-bae8-499e-b5a5-1872dcd1f704\") " Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.081938 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j" (OuterVolumeSpecName: "kube-api-access-4t86j") pod "b441fdb2-bae8-499e-b5a5-1872dcd1f704" (UID: "b441fdb2-bae8-499e-b5a5-1872dcd1f704"). InnerVolumeSpecName "kube-api-access-4t86j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.177859 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t86j\" (UniqueName: \"kubernetes.io/projected/b441fdb2-bae8-499e-b5a5-1872dcd1f704-kube-api-access-4t86j\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.746822 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" event={"ID":"b441fdb2-bae8-499e-b5a5-1872dcd1f704","Type":"ContainerDied","Data":"75ba9b6df514b989e0eb04b044bd2ed9108edc0a039c62ef783f162827de1e2a"} Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.747231 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75ba9b6df514b989e0eb04b044bd2ed9108edc0a039c62ef783f162827de1e2a" Mar 18 15:44:04 crc kubenswrapper[4792]: I0318 15:44:04.746916 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564144-nl5nl" Mar 18 15:44:05 crc kubenswrapper[4792]: I0318 15:44:05.034076 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-fjs6m"] Mar 18 15:44:05 crc kubenswrapper[4792]: I0318 15:44:05.039439 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564138-fjs6m"] Mar 18 15:44:05 crc kubenswrapper[4792]: I0318 15:44:05.862175 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ab1328-f29a-43da-9ed6-8a85635e8808" path="/var/lib/kubelet/pods/53ab1328-f29a-43da-9ed6-8a85635e8808/volumes" Mar 18 15:44:27 crc kubenswrapper[4792]: I0318 15:44:27.885184 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-57cbf6ff7f-ckxw6" podUID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" containerName="console" 
containerID="cri-o://e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc" gracePeriod=15 Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.279312 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57cbf6ff7f-ckxw6_00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a/console/0.log" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.279385 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.367504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.368540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config" (OuterVolumeSpecName: "console-config") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.468615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhjmk\" (UniqueName: \"kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.468697 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.468771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.468814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.468953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.469035 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca\") pod \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\" (UID: \"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a\") " Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.469447 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.469576 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.469857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca" (OuterVolumeSpecName: "service-ca") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.470147 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.474089 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk" (OuterVolumeSpecName: "kube-api-access-rhjmk") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "kube-api-access-rhjmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.474213 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.475917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" (UID: "00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570793 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570827 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570836 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhjmk\" (UniqueName: \"kubernetes.io/projected/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-kube-api-access-rhjmk\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570848 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570857 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.570865 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904405 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57cbf6ff7f-ckxw6_00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a/console/0.log" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904443 4792 
generic.go:334] "Generic (PLEG): container finished" podID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" containerID="e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc" exitCode=2 Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904467 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-ckxw6" event={"ID":"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a","Type":"ContainerDied","Data":"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc"} Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-ckxw6" event={"ID":"00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a","Type":"ContainerDied","Data":"01f838b0d9f1abd17d7fcc46c43711b0ac5a81cb76a38c79556d7ad5f4b737dd"} Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904505 4792 scope.go:117] "RemoveContainer" containerID="e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.904591 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cbf6ff7f-ckxw6" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.928871 4792 scope.go:117] "RemoveContainer" containerID="e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc" Mar 18 15:44:28 crc kubenswrapper[4792]: E0318 15:44:28.931563 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc\": container with ID starting with e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc not found: ID does not exist" containerID="e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.931707 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc"} err="failed to get container status \"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc\": rpc error: code = NotFound desc = could not find container \"e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc\": container with ID starting with e36e5436f079d2346c91959f6c304bee5498234f1717159c3f3adcf1e89c4cbc not found: ID does not exist" Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.939939 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:44:28 crc kubenswrapper[4792]: I0318 15:44:28.944371 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57cbf6ff7f-ckxw6"] Mar 18 15:44:29 crc kubenswrapper[4792]: I0318 15:44:29.872815 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" path="/var/lib/kubelet/pods/00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a/volumes" Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.321519 4792 patch_prober.go:28] interesting 
pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.321595 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.321639 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.322169 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.322222 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe" gracePeriod=600 Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.919325 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe" exitCode=0 Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.919401 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe"} Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.919933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d"} Mar 18 15:44:30 crc kubenswrapper[4792]: I0318 15:44:30.919957 4792 scope.go:117] "RemoveContainer" containerID="e556db138e974707e7c40c2b1e588b4b2e220ef714206286fc58191fa3277348" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.133112 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f"] Mar 18 15:45:00 crc kubenswrapper[4792]: E0318 15:45:00.133895 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" containerName="console" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.133909 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" containerName="console" Mar 18 15:45:00 crc kubenswrapper[4792]: E0318 15:45:00.133927 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b441fdb2-bae8-499e-b5a5-1872dcd1f704" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.133934 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b441fdb2-bae8-499e-b5a5-1872dcd1f704" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.134080 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00de2e5a-3cdc-4b5a-93f2-4642c1b3dd8a" containerName="console" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.134097 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b441fdb2-bae8-499e-b5a5-1872dcd1f704" containerName="oc" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.135477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.138956 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.139698 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.142138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f"] Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.332892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.333256 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.333343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl64t\" 
(UniqueName: \"kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.434795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.434880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.435082 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl64t\" (UniqueName: \"kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.436132 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 
15:45:00.445030 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.451361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl64t\" (UniqueName: \"kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t\") pod \"collect-profiles-29564145-dbk6f\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.457014 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:00 crc kubenswrapper[4792]: I0318 15:45:00.648683 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f"] Mar 18 15:45:01 crc kubenswrapper[4792]: I0318 15:45:01.111892 4792 generic.go:334] "Generic (PLEG): container finished" podID="a2f59420-9ada-4d61-a361-e8366afd90e6" containerID="ff31ee7a894b16e3dede3cc112abed401933798a935188e84a112ce88177e08c" exitCode=0 Mar 18 15:45:01 crc kubenswrapper[4792]: I0318 15:45:01.111937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" event={"ID":"a2f59420-9ada-4d61-a361-e8366afd90e6","Type":"ContainerDied","Data":"ff31ee7a894b16e3dede3cc112abed401933798a935188e84a112ce88177e08c"} Mar 18 15:45:01 crc kubenswrapper[4792]: I0318 15:45:01.111961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" 
event={"ID":"a2f59420-9ada-4d61-a361-e8366afd90e6","Type":"ContainerStarted","Data":"86fef560c49872065987858c0771166979c889d07e998ea6f69d5bd8c662aeb5"} Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.323647 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.364224 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume\") pod \"a2f59420-9ada-4d61-a361-e8366afd90e6\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.364427 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume\") pod \"a2f59420-9ada-4d61-a361-e8366afd90e6\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.364476 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl64t\" (UniqueName: \"kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t\") pod \"a2f59420-9ada-4d61-a361-e8366afd90e6\" (UID: \"a2f59420-9ada-4d61-a361-e8366afd90e6\") " Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.365099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "a2f59420-9ada-4d61-a361-e8366afd90e6" (UID: "a2f59420-9ada-4d61-a361-e8366afd90e6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.369626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t" (OuterVolumeSpecName: "kube-api-access-cl64t") pod "a2f59420-9ada-4d61-a361-e8366afd90e6" (UID: "a2f59420-9ada-4d61-a361-e8366afd90e6"). InnerVolumeSpecName "kube-api-access-cl64t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.369786 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a2f59420-9ada-4d61-a361-e8366afd90e6" (UID: "a2f59420-9ada-4d61-a361-e8366afd90e6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.465516 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2f59420-9ada-4d61-a361-e8366afd90e6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.465763 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2f59420-9ada-4d61-a361-e8366afd90e6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:02 crc kubenswrapper[4792]: I0318 15:45:02.465839 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl64t\" (UniqueName: \"kubernetes.io/projected/a2f59420-9ada-4d61-a361-e8366afd90e6-kube-api-access-cl64t\") on node \"crc\" DevicePath \"\"" Mar 18 15:45:03 crc kubenswrapper[4792]: I0318 15:45:03.124697 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" Mar 18 15:45:03 crc kubenswrapper[4792]: I0318 15:45:03.125161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f" event={"ID":"a2f59420-9ada-4d61-a361-e8366afd90e6","Type":"ContainerDied","Data":"86fef560c49872065987858c0771166979c889d07e998ea6f69d5bd8c662aeb5"} Mar 18 15:45:03 crc kubenswrapper[4792]: I0318 15:45:03.125245 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fef560c49872065987858c0771166979c889d07e998ea6f69d5bd8c662aeb5" Mar 18 15:45:25 crc kubenswrapper[4792]: I0318 15:45:25.971653 4792 scope.go:117] "RemoveContainer" containerID="ac0224c69f699d2cbdf0f640ce079bf0e89f74eaa87c15e9a094a39721d510d9" Mar 18 15:45:26 crc kubenswrapper[4792]: I0318 15:45:26.002110 4792 scope.go:117] "RemoveContainer" containerID="fe197a4474095149db460a6ac9ce05e600a95eca3b713cc6e9e79f5c8c379abd" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.149227 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564146-z6l4d"] Mar 18 15:46:00 crc kubenswrapper[4792]: E0318 15:46:00.150485 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f59420-9ada-4d61-a361-e8366afd90e6" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.150522 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f59420-9ada-4d61-a361-e8366afd90e6" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.150822 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f59420-9ada-4d61-a361-e8366afd90e6" containerName="collect-profiles" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.151699 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.154907 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.155152 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.155478 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.158864 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-z6l4d"] Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.174103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzg9w\" (UniqueName: \"kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w\") pod \"auto-csr-approver-29564146-z6l4d\" (UID: \"9a82a2eb-cb6d-4ccd-b7f0-610978943926\") " pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.275327 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzg9w\" (UniqueName: \"kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w\") pod \"auto-csr-approver-29564146-z6l4d\" (UID: \"9a82a2eb-cb6d-4ccd-b7f0-610978943926\") " pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.294901 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzg9w\" (UniqueName: \"kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w\") pod \"auto-csr-approver-29564146-z6l4d\" (UID: \"9a82a2eb-cb6d-4ccd-b7f0-610978943926\") " 
pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.471216 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:00 crc kubenswrapper[4792]: I0318 15:46:00.643360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-z6l4d"] Mar 18 15:46:01 crc kubenswrapper[4792]: I0318 15:46:01.502678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" event={"ID":"9a82a2eb-cb6d-4ccd-b7f0-610978943926","Type":"ContainerStarted","Data":"9a7b2c5cf5c3c954bcd3f8d305b2ab21ff3a8b433c71ce647c7e7a07c444e446"} Mar 18 15:46:02 crc kubenswrapper[4792]: I0318 15:46:02.511335 4792 generic.go:334] "Generic (PLEG): container finished" podID="9a82a2eb-cb6d-4ccd-b7f0-610978943926" containerID="75baa09b269232bc32b552f31ef7a606716742b9b39d7045a77c8c68788bd6da" exitCode=0 Mar 18 15:46:02 crc kubenswrapper[4792]: I0318 15:46:02.511443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" event={"ID":"9a82a2eb-cb6d-4ccd-b7f0-610978943926","Type":"ContainerDied","Data":"75baa09b269232bc32b552f31ef7a606716742b9b39d7045a77c8c68788bd6da"} Mar 18 15:46:03 crc kubenswrapper[4792]: I0318 15:46:03.754441 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:03 crc kubenswrapper[4792]: I0318 15:46:03.923649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzg9w\" (UniqueName: \"kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w\") pod \"9a82a2eb-cb6d-4ccd-b7f0-610978943926\" (UID: \"9a82a2eb-cb6d-4ccd-b7f0-610978943926\") " Mar 18 15:46:03 crc kubenswrapper[4792]: I0318 15:46:03.929093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w" (OuterVolumeSpecName: "kube-api-access-gzg9w") pod "9a82a2eb-cb6d-4ccd-b7f0-610978943926" (UID: "9a82a2eb-cb6d-4ccd-b7f0-610978943926"). InnerVolumeSpecName "kube-api-access-gzg9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.025456 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzg9w\" (UniqueName: \"kubernetes.io/projected/9a82a2eb-cb6d-4ccd-b7f0-610978943926-kube-api-access-gzg9w\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.527643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" event={"ID":"9a82a2eb-cb6d-4ccd-b7f0-610978943926","Type":"ContainerDied","Data":"9a7b2c5cf5c3c954bcd3f8d305b2ab21ff3a8b433c71ce647c7e7a07c444e446"} Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.527677 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564146-z6l4d" Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.527692 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a7b2c5cf5c3c954bcd3f8d305b2ab21ff3a8b433c71ce647c7e7a07c444e446" Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.807603 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-x84rj"] Mar 18 15:46:04 crc kubenswrapper[4792]: I0318 15:46:04.815861 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564140-x84rj"] Mar 18 15:46:05 crc kubenswrapper[4792]: I0318 15:46:05.864355 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03d9dab-5256-4887-8fed-296b586402c3" path="/var/lib/kubelet/pods/c03d9dab-5256-4887-8fed-296b586402c3/volumes" Mar 18 15:46:26 crc kubenswrapper[4792]: I0318 15:46:26.061577 4792 scope.go:117] "RemoveContainer" containerID="0a327326d3ab67e4f51b01b4312dd3a047657baf837bfe4096ff5e6f8af40251" Mar 18 15:46:30 crc kubenswrapper[4792]: I0318 15:46:30.322323 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:46:30 crc kubenswrapper[4792]: I0318 15:46:30.322843 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.640836 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq"] Mar 18 15:46:45 crc kubenswrapper[4792]: E0318 15:46:45.641583 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a82a2eb-cb6d-4ccd-b7f0-610978943926" containerName="oc" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.641596 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a82a2eb-cb6d-4ccd-b7f0-610978943926" containerName="oc" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.641701 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a82a2eb-cb6d-4ccd-b7f0-610978943926" containerName="oc" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.642551 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.644747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.654646 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq"] Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.841705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.841946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.842154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jns6f\" (UniqueName: \"kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.943769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.944062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jns6f\" (UniqueName: \"kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.944156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.944511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.944712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:45 crc kubenswrapper[4792]: I0318 15:46:45.997641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jns6f\" (UniqueName: \"kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:46 crc kubenswrapper[4792]: I0318 15:46:46.270608 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:46 crc kubenswrapper[4792]: I0318 15:46:46.677363 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq"] Mar 18 15:46:46 crc kubenswrapper[4792]: I0318 15:46:46.802247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" event={"ID":"9334cbd7-09b7-446e-ba60-975e332a9630","Type":"ContainerStarted","Data":"977bc89a58266a09f2685111833a368598bf4761d134c3adf4d1365329de106b"} Mar 18 15:46:47 crc kubenswrapper[4792]: I0318 15:46:47.809866 4792 generic.go:334] "Generic (PLEG): container finished" podID="9334cbd7-09b7-446e-ba60-975e332a9630" containerID="449789d89153b9f65c6dfe358272a92d3076add81fc6a0a80bb76190dc4176b4" exitCode=0 Mar 18 15:46:47 crc kubenswrapper[4792]: I0318 15:46:47.809917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" event={"ID":"9334cbd7-09b7-446e-ba60-975e332a9630","Type":"ContainerDied","Data":"449789d89153b9f65c6dfe358272a92d3076add81fc6a0a80bb76190dc4176b4"} Mar 18 15:46:49 crc kubenswrapper[4792]: I0318 15:46:49.821511 4792 generic.go:334] "Generic (PLEG): container finished" podID="9334cbd7-09b7-446e-ba60-975e332a9630" containerID="4f4ed0fb8f7617fdfadb375a000ea5b55a0d454213223f887cc640aab728d238" exitCode=0 Mar 18 15:46:49 crc kubenswrapper[4792]: I0318 15:46:49.821554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" event={"ID":"9334cbd7-09b7-446e-ba60-975e332a9630","Type":"ContainerDied","Data":"4f4ed0fb8f7617fdfadb375a000ea5b55a0d454213223f887cc640aab728d238"} Mar 18 15:46:50 crc kubenswrapper[4792]: I0318 15:46:50.830900 4792 
generic.go:334] "Generic (PLEG): container finished" podID="9334cbd7-09b7-446e-ba60-975e332a9630" containerID="a264c19fd899c23d914796d9015384b8fb3d47c933e04c6b5922c0fd703f4f2a" exitCode=0 Mar 18 15:46:50 crc kubenswrapper[4792]: I0318 15:46:50.831166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" event={"ID":"9334cbd7-09b7-446e-ba60-975e332a9630","Type":"ContainerDied","Data":"a264c19fd899c23d914796d9015384b8fb3d47c933e04c6b5922c0fd703f4f2a"} Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.065562 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.223231 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util\") pod \"9334cbd7-09b7-446e-ba60-975e332a9630\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.223389 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle\") pod \"9334cbd7-09b7-446e-ba60-975e332a9630\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.223517 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jns6f\" (UniqueName: \"kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f\") pod \"9334cbd7-09b7-446e-ba60-975e332a9630\" (UID: \"9334cbd7-09b7-446e-ba60-975e332a9630\") " Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.227597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle" (OuterVolumeSpecName: "bundle") pod "9334cbd7-09b7-446e-ba60-975e332a9630" (UID: "9334cbd7-09b7-446e-ba60-975e332a9630"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.237902 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f" (OuterVolumeSpecName: "kube-api-access-jns6f") pod "9334cbd7-09b7-446e-ba60-975e332a9630" (UID: "9334cbd7-09b7-446e-ba60-975e332a9630"). InnerVolumeSpecName "kube-api-access-jns6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.253167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util" (OuterVolumeSpecName: "util") pod "9334cbd7-09b7-446e-ba60-975e332a9630" (UID: "9334cbd7-09b7-446e-ba60-975e332a9630"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.325876 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.325909 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9334cbd7-09b7-446e-ba60-975e332a9630-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.325919 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jns6f\" (UniqueName: \"kubernetes.io/projected/9334cbd7-09b7-446e-ba60-975e332a9630-kube-api-access-jns6f\") on node \"crc\" DevicePath \"\"" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.844773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" event={"ID":"9334cbd7-09b7-446e-ba60-975e332a9630","Type":"ContainerDied","Data":"977bc89a58266a09f2685111833a368598bf4761d134c3adf4d1365329de106b"} Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.844823 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977bc89a58266a09f2685111833a368598bf4761d134c3adf4d1365329de106b" Mar 18 15:46:52 crc kubenswrapper[4792]: I0318 15:46:52.844889 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq" Mar 18 15:47:00 crc kubenswrapper[4792]: I0318 15:47:00.322114 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:47:00 crc kubenswrapper[4792]: I0318 15:47:00.323776 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.319876 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4pndk"] Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.320930 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-controller" containerID="cri-o://b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321048 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321049 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" 
podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="nbdb" containerID="cri-o://3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321078 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="northd" containerID="cri-o://20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321089 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-node" containerID="cri-o://6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321098 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-acl-logging" containerID="cri-o://3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.321110 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="sbdb" containerID="cri-o://69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" gracePeriod=30 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.359987 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" containerID="cri-o://11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" gracePeriod=30 
Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.603816 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb"] Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.604074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="pull" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.604094 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="pull" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.604120 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="util" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.604128 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="util" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.604138 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="extract" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.604146 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="extract" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.604267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9334cbd7-09b7-446e-ba60-975e332a9630" containerName="extract" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.604729 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.606636 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.606681 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-h9vr5" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.607043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.753718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhwnf\" (UniqueName: \"kubernetes.io/projected/58802970-175f-48a9-aa0b-25cbd849fecf-kube-api-access-xhwnf\") pod \"obo-prometheus-operator-8ff7d675-7m5sb\" (UID: \"58802970-175f-48a9-aa0b-25cbd849fecf\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.856314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhwnf\" (UniqueName: \"kubernetes.io/projected/58802970-175f-48a9-aa0b-25cbd849fecf-kube-api-access-xhwnf\") pod \"obo-prometheus-operator-8ff7d675-7m5sb\" (UID: \"58802970-175f-48a9-aa0b-25cbd849fecf\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.890277 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8"] Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.891048 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.893941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhwnf\" (UniqueName: \"kubernetes.io/projected/58802970-175f-48a9-aa0b-25cbd849fecf-kube-api-access-xhwnf\") pod \"obo-prometheus-operator-8ff7d675-7m5sb\" (UID: \"58802970-175f-48a9-aa0b-25cbd849fecf\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.895366 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-5dzsp" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.896111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.897500 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/2.log" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.897847 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/1.log" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.897897 4792 generic.go:334] "Generic (PLEG): container finished" podID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" containerID="8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217" exitCode=2 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.898013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerDied","Data":"8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 
15:47:01.898069 4792 scope.go:117] "RemoveContainer" containerID="b54f749b6a2a0b60fa496e6aa34e6a00b5dcd1c891b6db681fb829757aaac914" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.898757 4792 scope.go:117] "RemoveContainer" containerID="8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.899073 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fqr6h_openshift-multus(241b9e3f-bd41-4fb2-a68a-9395a67feaae)\"" pod="openshift-multus/multus-fqr6h" podUID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.902464 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovnkube-controller/3.log" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.904769 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml"] Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.905791 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.919184 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-acl-logging/0.log" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.920095 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.923413 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-controller/0.log" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925124 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" exitCode=0 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925167 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" exitCode=0 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925177 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" exitCode=0 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925187 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" exitCode=0 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925195 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" exitCode=143 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925204 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" exitCode=143 Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925257 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979"} Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.925842 4792 scope.go:117] "RemoveContainer" containerID="937b71f967daf3854bb577d61a47ec98d83aa5634ff7b93f8e94ec18cab19eec" Mar 18 15:47:01 crc 
kubenswrapper[4792]: E0318 15:47:01.958048 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(87992aa658205ea37e7fe7f77486f32543a51bd0da7f85a09c650330ee9eef74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.958121 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(87992aa658205ea37e7fe7f77486f32543a51bd0da7f85a09c650330ee9eef74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.958141 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(87992aa658205ea37e7fe7f77486f32543a51bd0da7f85a09c650330ee9eef74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:01 crc kubenswrapper[4792]: E0318 15:47:01.958179 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(87992aa658205ea37e7fe7f77486f32543a51bd0da7f85a09c650330ee9eef74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" podUID="58802970-175f-48a9-aa0b-25cbd849fecf" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.958382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.958484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.958559 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:01 crc kubenswrapper[4792]: I0318 15:47:01.958639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.059627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.059683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.059738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.059777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.062991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.064573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.065501 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d350c21d-f3fd-4b9e-a5f2-d7172fb87714-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8\" (UID: \"d350c21d-f3fd-4b9e-a5f2-d7172fb87714\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.065905 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/997642b8-111c-438c-906c-ace1a270f33b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml\" (UID: \"997642b8-111c-438c-906c-ace1a270f33b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.215943 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.235357 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.256197 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(b164976e74b78b9afb415158c0b46d02029a872eb4ab640b2236c916e1220080): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.256276 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(b164976e74b78b9afb415158c0b46d02029a872eb4ab640b2236c916e1220080): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.256300 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(b164976e74b78b9afb415158c0b46d02029a872eb4ab640b2236c916e1220080): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.256349 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(b164976e74b78b9afb415158c0b46d02029a872eb4ab640b2236c916e1220080): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" podUID="d350c21d-f3fd-4b9e-a5f2-d7172fb87714" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.274385 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(d2ac8bb4b419db021f1bbdbab21e0f66389fade39170597454dd10ed4e7ad366): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.281377 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(d2ac8bb4b419db021f1bbdbab21e0f66389fade39170597454dd10ed4e7ad366): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.281467 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(d2ac8bb4b419db021f1bbdbab21e0f66389fade39170597454dd10ed4e7ad366): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.281590 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(d2ac8bb4b419db021f1bbdbab21e0f66389fade39170597454dd10ed4e7ad366): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" podUID="997642b8-111c-438c-906c-ace1a270f33b" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.303089 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-x5w94"] Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.303859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.305441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-kpxnn" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.311244 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.364312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/255ea945-6e83-4ead-b609-b47a6b5eaafa-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.365247 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449dh\" (UniqueName: \"kubernetes.io/projected/255ea945-6e83-4ead-b609-b47a6b5eaafa-kube-api-access-449dh\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.468397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-449dh\" (UniqueName: \"kubernetes.io/projected/255ea945-6e83-4ead-b609-b47a6b5eaafa-kube-api-access-449dh\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.468741 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/255ea945-6e83-4ead-b609-b47a6b5eaafa-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.472556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/255ea945-6e83-4ead-b609-b47a6b5eaafa-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.485760 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-449dh\" (UniqueName: \"kubernetes.io/projected/255ea945-6e83-4ead-b609-b47a6b5eaafa-kube-api-access-449dh\") pod \"observability-operator-6dd7dd855f-x5w94\" (UID: \"255ea945-6e83-4ead-b609-b47a6b5eaafa\") " pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.544499 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-acl-logging/0.log" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.545149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-controller/0.log" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.545591 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569472 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569550 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569574 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569618 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). 
InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569648 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569711 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569760 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.569827 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570281 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570447 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqswl\" (UniqueName: \"kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570566 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570606 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570686 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 
crc kubenswrapper[4792]: I0318 15:47:02.570743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.570764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin\") pod \"4e6f23ed-13da-466a-8c55-1043d6e0b748\" (UID: \"4e6f23ed-13da-466a-8c55-1043d6e0b748\") " Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571023 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571038 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571047 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571056 4792 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571064 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket" (OuterVolumeSpecName: "log-socket") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571420 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log" (OuterVolumeSpecName: "node-log") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571495 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash" (OuterVolumeSpecName: "host-slash") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571964 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.572020 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.572049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.571988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.573274 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.573578 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.592030 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl" (OuterVolumeSpecName: "kube-api-access-mqswl") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "kube-api-access-mqswl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.604868 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4e6f23ed-13da-466a-8c55-1043d6e0b748" (UID: "4e6f23ed-13da-466a-8c55-1043d6e0b748"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621334 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vmvgd"] Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621621 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kubecfg-setup" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621649 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kubecfg-setup" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621668 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621681 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621698 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621709 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621721 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621728 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621742 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" 
containerName="nbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621749 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="nbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621758 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="sbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="sbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621775 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621782 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621789 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621812 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621820 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" 
containerName="ovn-acl-logging" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621836 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-acl-logging" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621845 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-node" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621851 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-node" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.621857 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="northd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.621862 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="northd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.622040 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="northd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.622058 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.622066 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-acl-logging" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623522 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovn-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623541 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" 
containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623549 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623621 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623634 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-node" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623645 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623760 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="sbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623771 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="nbdb" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.623457 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.625371 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.625387 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.625536 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerName="ovnkube-controller" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.627892 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.658039 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-d9577b4dd-zfrmv"] Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.670094 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(e3d1908cd0a3a9d4a12b6cea9411f32b8a8cb6fcf54a51f46ed28e6117f44b1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.670162 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(e3d1908cd0a3a9d4a12b6cea9411f32b8a8cb6fcf54a51f46ed28e6117f44b1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.670182 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(e3d1908cd0a3a9d4a12b6cea9411f32b8a8cb6fcf54a51f46ed28e6117f44b1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:02 crc kubenswrapper[4792]: E0318 15:47:02.670223 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(e3d1908cd0a3a9d4a12b6cea9411f32b8a8cb6fcf54a51f46ed28e6117f44b1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674570 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-config\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-systemd-units\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-log-socket\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674675 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-node-log\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674694 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-slash\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674713 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-netd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674737 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-kubelet\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674763 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674788 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-ovn\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674841 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-bin\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-etc-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-var-lib-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-netns\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.674994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-env-overrides\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675037 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovn-node-metrics-cert\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675060 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-script-lib\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675080 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjk9\" (UniqueName: \"kubernetes.io/projected/80a16394-a1e2-4e90-838f-b0475d73b5a6-kube-api-access-wdjk9\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-systemd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675148 4792 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675162 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675175 4792 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675186 
4792 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675196 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqswl\" (UniqueName: \"kubernetes.io/projected/4e6f23ed-13da-466a-8c55-1043d6e0b748-kube-api-access-mqswl\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675207 4792 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675219 4792 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675231 4792 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675243 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675254 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675264 4792 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675275 4792 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675288 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e6f23ed-13da-466a-8c55-1043d6e0b748-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675299 4792 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.675310 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e6f23ed-13da-466a-8c55-1043d6e0b748-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.680578 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.680876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-k47lh" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.777439 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovn-node-metrics-cert\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: 
I0318 15:47:02.777743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxxr\" (UniqueName: \"kubernetes.io/projected/15bde542-1ffd-48b4-b2cf-98d98348920e-kube-api-access-grxxr\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.777842 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-webhook-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.777949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-script-lib\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778065 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjk9\" (UniqueName: \"kubernetes.io/projected/80a16394-a1e2-4e90-838f-b0475d73b5a6-kube-api-access-wdjk9\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-systemd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 
crc kubenswrapper[4792]: I0318 15:47:02.778276 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-config\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778373 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-systemd-units\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-log-socket\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778657 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-slash\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-node-log\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778829 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-netd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.778913 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-kubelet\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779077 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-ovn\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779206 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-apiservice-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-bin\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-etc-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779575 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-var-lib-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779684 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/15bde542-1ffd-48b4-b2cf-98d98348920e-openshift-service-ca\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779772 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-netns\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.779865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-env-overrides\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.780484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-env-overrides\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-kubelet\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-ovn\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781197 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-slash\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781194 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-log-socket\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-netd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781234 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: 
\"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-node-log\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-cni-bin\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-var-lib-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781341 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-run-systemd\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-host-run-netns\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-systemd-units\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-config\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.781756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/80a16394-a1e2-4e90-838f-b0475d73b5a6-etc-openvswitch\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.782029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovn-node-metrics-cert\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 
15:47:02.782263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/80a16394-a1e2-4e90-838f-b0475d73b5a6-ovnkube-script-lib\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.804638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjk9\" (UniqueName: \"kubernetes.io/projected/80a16394-a1e2-4e90-838f-b0475d73b5a6-kube-api-access-wdjk9\") pod \"ovnkube-node-vmvgd\" (UID: \"80a16394-a1e2-4e90-838f-b0475d73b5a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.881415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-webhook-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.881465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grxxr\" (UniqueName: \"kubernetes.io/projected/15bde542-1ffd-48b4-b2cf-98d98348920e-kube-api-access-grxxr\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.881545 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-apiservice-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 
15:47:02.881602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/15bde542-1ffd-48b4-b2cf-98d98348920e-openshift-service-ca\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.882535 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/15bde542-1ffd-48b4-b2cf-98d98348920e-openshift-service-ca\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.884835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-webhook-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.886521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15bde542-1ffd-48b4-b2cf-98d98348920e-apiservice-cert\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.896313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxxr\" (UniqueName: \"kubernetes.io/projected/15bde542-1ffd-48b4-b2cf-98d98348920e-kube-api-access-grxxr\") pod \"perses-operator-d9577b4dd-zfrmv\" (UID: \"15bde542-1ffd-48b4-b2cf-98d98348920e\") " pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:02 crc 
kubenswrapper[4792]: I0318 15:47:02.932495 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/2.log" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.937533 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-acl-logging/0.log" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.938786 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4pndk_4e6f23ed-13da-466a-8c55-1043d6e0b748/ovn-controller/0.log" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939071 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" exitCode=0 Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939093 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e6f23ed-13da-466a-8c55-1043d6e0b748" containerID="6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" exitCode=0 Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b"} Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939142 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1"} Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939153 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" 
event={"ID":"4e6f23ed-13da-466a-8c55-1043d6e0b748","Type":"ContainerDied","Data":"bd75d70e877e892950a35478814bf2eeedf8c188abd9ad1b0b4623f55b206380"} Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939177 4792 scope.go:117] "RemoveContainer" containerID="11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.939218 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pndk" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.957880 4792 scope.go:117] "RemoveContainer" containerID="69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.970236 4792 scope.go:117] "RemoveContainer" containerID="3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.971791 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.986190 4792 scope.go:117] "RemoveContainer" containerID="20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.987298 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4pndk"] Mar 18 15:47:02 crc kubenswrapper[4792]: I0318 15:47:02.993113 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4pndk"] Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.012897 4792 scope.go:117] "RemoveContainer" containerID="5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.018500 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.045146 4792 scope.go:117] "RemoveContainer" containerID="6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.060023 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(64c840833f8847d4d51ecc10596ea3d77bf0f517f539b645760e28fc6caa05d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.060089 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(64c840833f8847d4d51ecc10596ea3d77bf0f517f539b645760e28fc6caa05d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.060109 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(64c840833f8847d4d51ecc10596ea3d77bf0f517f539b645760e28fc6caa05d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.060149 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(64c840833f8847d4d51ecc10596ea3d77bf0f517f539b645760e28fc6caa05d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.075212 4792 scope.go:117] "RemoveContainer" containerID="3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.101123 4792 scope.go:117] "RemoveContainer" containerID="b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.131368 4792 scope.go:117] "RemoveContainer" containerID="ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.152855 4792 scope.go:117] "RemoveContainer" containerID="11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.153240 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217\": container with ID starting with 11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217 not found: ID does not 
exist" containerID="11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.153274 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217"} err="failed to get container status \"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217\": rpc error: code = NotFound desc = could not find container \"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217\": container with ID starting with 11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.153317 4792 scope.go:117] "RemoveContainer" containerID="69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.153710 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\": container with ID starting with 69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50 not found: ID does not exist" containerID="69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.153734 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50"} err="failed to get container status \"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\": rpc error: code = NotFound desc = could not find container \"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\": container with ID starting with 69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.153750 4792 scope.go:117] 
"RemoveContainer" containerID="3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.154088 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\": container with ID starting with 3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793 not found: ID does not exist" containerID="3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.154113 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793"} err="failed to get container status \"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\": rpc error: code = NotFound desc = could not find container \"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\": container with ID starting with 3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.154131 4792 scope.go:117] "RemoveContainer" containerID="20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.154740 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\": container with ID starting with 20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe not found: ID does not exist" containerID="20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.154765 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe"} err="failed to get container status \"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\": rpc error: code = NotFound desc = could not find container \"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\": container with ID starting with 20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.154782 4792 scope.go:117] "RemoveContainer" containerID="5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.155080 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\": container with ID starting with 5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b not found: ID does not exist" containerID="5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155105 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b"} err="failed to get container status \"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\": rpc error: code = NotFound desc = could not find container \"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\": container with ID starting with 5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155119 4792 scope.go:117] "RemoveContainer" containerID="6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.155383 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\": container with ID starting with 6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1 not found: ID does not exist" containerID="6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155406 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1"} err="failed to get container status \"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\": rpc error: code = NotFound desc = could not find container \"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\": container with ID starting with 6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155427 4792 scope.go:117] "RemoveContainer" containerID="3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.155641 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\": container with ID starting with 3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89 not found: ID does not exist" containerID="3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155672 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89"} err="failed to get container status \"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\": rpc error: code = NotFound desc = could not find container 
\"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\": container with ID starting with 3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155688 4792 scope.go:117] "RemoveContainer" containerID="b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.155899 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\": container with ID starting with b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979 not found: ID does not exist" containerID="b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155923 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979"} err="failed to get container status \"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\": rpc error: code = NotFound desc = could not find container \"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\": container with ID starting with b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.155989 4792 scope.go:117] "RemoveContainer" containerID="ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9" Mar 18 15:47:03 crc kubenswrapper[4792]: E0318 15:47:03.158837 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\": container with ID starting with ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9 not found: ID does not exist" 
containerID="ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.158906 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9"} err="failed to get container status \"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\": rpc error: code = NotFound desc = could not find container \"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\": container with ID starting with ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.158937 4792 scope.go:117] "RemoveContainer" containerID="11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.162647 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217"} err="failed to get container status \"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217\": rpc error: code = NotFound desc = could not find container \"11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217\": container with ID starting with 11cae6cdb7fad140e022ce6b99ebd7591af5257cc981b1f46141d38400668217 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.162694 4792 scope.go:117] "RemoveContainer" containerID="69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.162989 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50"} err="failed to get container status \"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\": rpc error: code = NotFound desc = could 
not find container \"69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50\": container with ID starting with 69895fbbe87be07cd211d2d199c93f9f2579bcea23342759c29982257d1e5e50 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.163008 4792 scope.go:117] "RemoveContainer" containerID="3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.163206 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793"} err="failed to get container status \"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\": rpc error: code = NotFound desc = could not find container \"3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793\": container with ID starting with 3dc88d72c4bb938c6488f87c9920f1061fc8b4a776c658790cfb476a1f6b4793 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.163223 4792 scope.go:117] "RemoveContainer" containerID="20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.167113 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe"} err="failed to get container status \"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\": rpc error: code = NotFound desc = could not find container \"20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe\": container with ID starting with 20e040ff3bdf0290b77886cd4c4d56737ca9421c3e5fd3551dc84717c3bf11fe not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.167150 4792 scope.go:117] "RemoveContainer" containerID="5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 
15:47:03.167511 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b"} err="failed to get container status \"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\": rpc error: code = NotFound desc = could not find container \"5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b\": container with ID starting with 5994f6dfa9140a1b6cd2e4003ec2428f924dbe8926ad6648c1d913e7376c5c2b not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.167540 4792 scope.go:117] "RemoveContainer" containerID="6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.167846 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1"} err="failed to get container status \"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\": rpc error: code = NotFound desc = could not find container \"6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1\": container with ID starting with 6dac5b04b18d12529d8a0d6608aae17c8516e2971f932914aab75930c63593b1 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.167872 4792 scope.go:117] "RemoveContainer" containerID="3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.168141 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89"} err="failed to get container status \"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\": rpc error: code = NotFound desc = could not find container \"3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89\": container with ID starting with 
3e5513a42f187116abb13ab99a93ff7eb4a2cbe7f29fd921e9277d78bc08ca89 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.168179 4792 scope.go:117] "RemoveContainer" containerID="b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.168426 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979"} err="failed to get container status \"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\": rpc error: code = NotFound desc = could not find container \"b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979\": container with ID starting with b408a36d24adcd9fa8993f0fc948c93a37b59b0ddf8fac341897e310de3b6979 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.168460 4792 scope.go:117] "RemoveContainer" containerID="ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.168772 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9"} err="failed to get container status \"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\": rpc error: code = NotFound desc = could not find container \"ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9\": container with ID starting with ec1f66a48db36fa369b8eb9e3819e21b82dbbaf7c260c718840c6f4d4fdf38a9 not found: ID does not exist" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.864090 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6f23ed-13da-466a-8c55-1043d6e0b748" path="/var/lib/kubelet/pods/4e6f23ed-13da-466a-8c55-1043d6e0b748/volumes" Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.948790 4792 generic.go:334] "Generic (PLEG): container 
finished" podID="80a16394-a1e2-4e90-838f-b0475d73b5a6" containerID="785e4e94f39397aeadc73bf1d7877815c304fea31a31d6e92a71909ba3a6751c" exitCode=0 Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.948897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerDied","Data":"785e4e94f39397aeadc73bf1d7877815c304fea31a31d6e92a71909ba3a6751c"} Mar 18 15:47:03 crc kubenswrapper[4792]: I0318 15:47:03.949007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"9e8817f8215fb739588b14395492573055c62d96238ebb65e6ad8e25323f5852"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.960656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"332c0f3034113fc63b4d1c34f513ec474d91f2d5387996d9e9a7ff3e71a6382c"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.961204 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"d7910cdc0ee966629eb1187e3404f5a9a187246dd696c3faef9a266241887d16"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.961216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"0f0edfb73b6390ae2acf5d3273a4533ef55f62bcc72aea427f85c2e179e93c25"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.961227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" 
event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"8517b2fe68720e6a1e9b1ee33cc2eff66b44eec57d5aaf9913dbc95ba375638e"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.961239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"758bbb6de47c1b28396bb492c698173e7b72f1470bd7b01b1fe4ae24ae8d8a9b"} Mar 18 15:47:04 crc kubenswrapper[4792]: I0318 15:47:04.961249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"35377b0cdc817ed6a64b70b8a7c876dcd0a426adcd8b8fa19e5e6863a0a5e0d3"} Mar 18 15:47:06 crc kubenswrapper[4792]: I0318 15:47:06.973577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"180209d1c9baed6794d6a0df949469a8ac96bf915bd7ce4734992da4eae81556"} Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.002926 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" event={"ID":"80a16394-a1e2-4e90-838f-b0475d73b5a6","Type":"ContainerStarted","Data":"3281def7c92f8b700aa4e5a228546caec418312b1afe4b7e2b53a6781bb13c6a"} Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.003478 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.028744 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-x5w94"] Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.028870 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.029421 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.045029 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-d9577b4dd-zfrmv"] Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.045190 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.045649 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.046360 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" podStartSLOduration=8.046342783 podStartE2EDuration="8.046342783s" podCreationTimestamp="2026-03-18 15:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:47:10.032762032 +0000 UTC m=+778.902090969" watchObservedRunningTime="2026-03-18 15:47:10.046342783 +0000 UTC m=+778.915671720" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.061229 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.064049 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml"] Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.064207 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.064693 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.067360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8"] Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.067517 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.068021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.070126 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(49f7a078b5de092633316cb5396446f8e56909ca4708f6b2cc050507f88191bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.070203 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(49f7a078b5de092633316cb5396446f8e56909ca4708f6b2cc050507f88191bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.070248 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(49f7a078b5de092633316cb5396446f8e56909ca4708f6b2cc050507f88191bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.070286 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(49f7a078b5de092633316cb5396446f8e56909ca4708f6b2cc050507f88191bb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.089955 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb"] Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.090082 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:10 crc kubenswrapper[4792]: I0318 15:47:10.090474 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.140524 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(bb88c098cc0a0baf2b3dcead06584db49305957e9d397478e19c8b97e243c1c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.140593 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(bb88c098cc0a0baf2b3dcead06584db49305957e9d397478e19c8b97e243c1c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.140619 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(bb88c098cc0a0baf2b3dcead06584db49305957e9d397478e19c8b97e243c1c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.140676 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(bb88c098cc0a0baf2b3dcead06584db49305957e9d397478e19c8b97e243c1c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.154798 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(07872c161dc37c9ba91f76fea29861b692f386df8df39bc310b310b2eb8d09f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.154875 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(07872c161dc37c9ba91f76fea29861b692f386df8df39bc310b310b2eb8d09f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.154906 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(07872c161dc37c9ba91f76fea29861b692f386df8df39bc310b310b2eb8d09f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.154967 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(07872c161dc37c9ba91f76fea29861b692f386df8df39bc310b310b2eb8d09f0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" podUID="997642b8-111c-438c-906c-ace1a270f33b" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.161383 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(f5dbe32467d87c17dca078565b637c1393aeed6078abd8b88c9c2e140944ae9b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.161440 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(f5dbe32467d87c17dca078565b637c1393aeed6078abd8b88c9c2e140944ae9b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.161458 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(f5dbe32467d87c17dca078565b637c1393aeed6078abd8b88c9c2e140944ae9b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.161496 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(f5dbe32467d87c17dca078565b637c1393aeed6078abd8b88c9c2e140944ae9b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" podUID="58802970-175f-48a9-aa0b-25cbd849fecf" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.170321 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(a79f324a8f05b531719c9c148c5c04798bfff15229ac7f7f281dd0fb97db57df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.170374 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(a79f324a8f05b531719c9c148c5c04798bfff15229ac7f7f281dd0fb97db57df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.170397 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(a79f324a8f05b531719c9c148c5c04798bfff15229ac7f7f281dd0fb97db57df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:10 crc kubenswrapper[4792]: E0318 15:47:10.170432 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(a79f324a8f05b531719c9c148c5c04798bfff15229ac7f7f281dd0fb97db57df): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" podUID="d350c21d-f3fd-4b9e-a5f2-d7172fb87714" Mar 18 15:47:11 crc kubenswrapper[4792]: I0318 15:47:11.008931 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:11 crc kubenswrapper[4792]: I0318 15:47:11.009335 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:11 crc kubenswrapper[4792]: I0318 15:47:11.059605 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:15 crc kubenswrapper[4792]: I0318 15:47:15.854168 4792 scope.go:117] "RemoveContainer" containerID="8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217" Mar 18 15:47:15 crc kubenswrapper[4792]: E0318 15:47:15.854851 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fqr6h_openshift-multus(241b9e3f-bd41-4fb2-a68a-9395a67feaae)\"" pod="openshift-multus/multus-fqr6h" podUID="241b9e3f-bd41-4fb2-a68a-9395a67feaae" Mar 18 15:47:20 crc kubenswrapper[4792]: I0318 15:47:20.853467 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:20 crc kubenswrapper[4792]: I0318 15:47:20.854297 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:20 crc kubenswrapper[4792]: E0318 15:47:20.892109 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(0bc47827713d353711ae7e8f83a12a748235324d7d62386a37de861ff8ceb1df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:20 crc kubenswrapper[4792]: E0318 15:47:20.892497 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(0bc47827713d353711ae7e8f83a12a748235324d7d62386a37de861ff8ceb1df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:20 crc kubenswrapper[4792]: E0318 15:47:20.892525 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(0bc47827713d353711ae7e8f83a12a748235324d7d62386a37de861ff8ceb1df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:20 crc kubenswrapper[4792]: E0318 15:47:20.892583 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-d9577b4dd-zfrmv_openshift-operators(15bde542-1ffd-48b4-b2cf-98d98348920e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-d9577b4dd-zfrmv_openshift-operators_15bde542-1ffd-48b4-b2cf-98d98348920e_0(0bc47827713d353711ae7e8f83a12a748235324d7d62386a37de861ff8ceb1df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" Mar 18 15:47:21 crc kubenswrapper[4792]: I0318 15:47:21.854018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:21 crc kubenswrapper[4792]: I0318 15:47:21.857031 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:21 crc kubenswrapper[4792]: E0318 15:47:21.880372 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(d9232e5eb1d0bfe3cccd5224b0674d43550ea1e44b5fadf624e5182402ef7dcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:47:21 crc kubenswrapper[4792]: E0318 15:47:21.880414 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(d9232e5eb1d0bfe3cccd5224b0674d43550ea1e44b5fadf624e5182402ef7dcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:21 crc kubenswrapper[4792]: E0318 15:47:21.880434 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(d9232e5eb1d0bfe3cccd5224b0674d43550ea1e44b5fadf624e5182402ef7dcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:21 crc kubenswrapper[4792]: E0318 15:47:21.880473 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators(d350c21d-f3fd-4b9e-a5f2-d7172fb87714)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_openshift-operators_d350c21d-f3fd-4b9e-a5f2-d7172fb87714_0(d9232e5eb1d0bfe3cccd5224b0674d43550ea1e44b5fadf624e5182402ef7dcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" podUID="d350c21d-f3fd-4b9e-a5f2-d7172fb87714" Mar 18 15:47:22 crc kubenswrapper[4792]: I0318 15:47:22.854162 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:22 crc kubenswrapper[4792]: I0318 15:47:22.854621 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:22 crc kubenswrapper[4792]: I0318 15:47:22.854876 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:22 crc kubenswrapper[4792]: I0318 15:47:22.855352 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.888232 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(6c528da11abe0265a8fcc3c74e6f614d34a7cf632e00e899731a1b25c3ec4483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.888399 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(6c528da11abe0265a8fcc3c74e6f614d34a7cf632e00e899731a1b25c3ec4483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.888424 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(6c528da11abe0265a8fcc3c74e6f614d34a7cf632e00e899731a1b25c3ec4483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.888493 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators(58802970-175f-48a9-aa0b-25cbd849fecf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-7m5sb_openshift-operators_58802970-175f-48a9-aa0b-25cbd849fecf_0(6c528da11abe0265a8fcc3c74e6f614d34a7cf632e00e899731a1b25c3ec4483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" podUID="58802970-175f-48a9-aa0b-25cbd849fecf" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.890249 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(0b7ce12d78c703db91fe3c51123e2d570f889b78eb213b89fecc0520eb82c8d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.890306 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(0b7ce12d78c703db91fe3c51123e2d570f889b78eb213b89fecc0520eb82c8d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.890325 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(0b7ce12d78c703db91fe3c51123e2d570f889b78eb213b89fecc0520eb82c8d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:22 crc kubenswrapper[4792]: E0318 15:47:22.890373 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-x5w94_openshift-operators(255ea945-6e83-4ead-b609-b47a6b5eaafa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-x5w94_openshift-operators_255ea945-6e83-4ead-b609-b47a6b5eaafa_0(0b7ce12d78c703db91fe3c51123e2d570f889b78eb213b89fecc0520eb82c8d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" Mar 18 15:47:23 crc kubenswrapper[4792]: I0318 15:47:23.853620 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:23 crc kubenswrapper[4792]: I0318 15:47:23.854440 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:23 crc kubenswrapper[4792]: E0318 15:47:23.876223 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(b482615f04ed177a844f49f979c530c25f67d68abd310d112e865c01f58d4258): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 15:47:23 crc kubenswrapper[4792]: E0318 15:47:23.876280 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(b482615f04ed177a844f49f979c530c25f67d68abd310d112e865c01f58d4258): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:23 crc kubenswrapper[4792]: E0318 15:47:23.876299 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(b482615f04ed177a844f49f979c530c25f67d68abd310d112e865c01f58d4258): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:23 crc kubenswrapper[4792]: E0318 15:47:23.876343 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators(997642b8-111c-438c-906c-ace1a270f33b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_openshift-operators_997642b8-111c-438c-906c-ace1a270f33b_0(b482615f04ed177a844f49f979c530c25f67d68abd310d112e865c01f58d4258): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" podUID="997642b8-111c-438c-906c-ace1a270f33b" Mar 18 15:47:27 crc kubenswrapper[4792]: I0318 15:47:27.853850 4792 scope.go:117] "RemoveContainer" containerID="8ebf218f5e63c5d2034c6d6faf2b47bc35d407346d3717228f32e837d2a59217" Mar 18 15:47:28 crc kubenswrapper[4792]: I0318 15:47:28.096326 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fqr6h_241b9e3f-bd41-4fb2-a68a-9395a67feaae/kube-multus/2.log" Mar 18 15:47:28 crc kubenswrapper[4792]: I0318 15:47:28.096625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fqr6h" event={"ID":"241b9e3f-bd41-4fb2-a68a-9395a67feaae","Type":"ContainerStarted","Data":"572f6701b44aa55b70ac8fb15e7fe9f4ff9846e1a712d8ead742204dbd99753a"} Mar 18 15:47:30 crc kubenswrapper[4792]: I0318 15:47:30.321533 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:47:30 crc kubenswrapper[4792]: I0318 15:47:30.321810 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:47:30 crc kubenswrapper[4792]: I0318 15:47:30.321857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:47:30 crc kubenswrapper[4792]: I0318 15:47:30.322446 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:47:30 crc kubenswrapper[4792]: I0318 15:47:30.322495 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d" gracePeriod=600 Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.119606 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d" exitCode=0 Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.119674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d"} Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.120121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957"} Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.120143 4792 scope.go:117] "RemoveContainer" containerID="370a0610f69f90c455b9de196ad13a03d25979aee2067519bf5c4577cf759efe" Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.853258 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:31 crc kubenswrapper[4792]: I0318 15:47:31.857817 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:32 crc kubenswrapper[4792]: I0318 15:47:32.256048 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-d9577b4dd-zfrmv"] Mar 18 15:47:33 crc kubenswrapper[4792]: I0318 15:47:33.015585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vmvgd" Mar 18 15:47:33 crc kubenswrapper[4792]: I0318 15:47:33.135913 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" event={"ID":"15bde542-1ffd-48b4-b2cf-98d98348920e","Type":"ContainerStarted","Data":"adc908c9d663e8f3ff2a260bacb34cca88a7e2917304bb61c46526444651aee1"} Mar 18 15:47:34 crc kubenswrapper[4792]: I0318 15:47:34.853922 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:34 crc kubenswrapper[4792]: I0318 15:47:34.855018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" Mar 18 15:47:35 crc kubenswrapper[4792]: I0318 15:47:35.300380 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml"] Mar 18 15:47:35 crc kubenswrapper[4792]: I0318 15:47:35.854329 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:35 crc kubenswrapper[4792]: I0318 15:47:35.855231 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" Mar 18 15:47:36 crc kubenswrapper[4792]: W0318 15:47:36.436129 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod997642b8_111c_438c_906c_ace1a270f33b.slice/crio-b725b2536972bf65df8ae807a9b5ea1b5147916c1a2e8f9fd90609703ba65e9c WatchSource:0}: Error finding container b725b2536972bf65df8ae807a9b5ea1b5147916c1a2e8f9fd90609703ba65e9c: Status 404 returned error can't find the container with id b725b2536972bf65df8ae807a9b5ea1b5147916c1a2e8f9fd90609703ba65e9c Mar 18 15:47:36 crc kubenswrapper[4792]: I0318 15:47:36.846536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8"] Mar 18 15:47:36 crc kubenswrapper[4792]: I0318 15:47:36.853687 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:36 crc kubenswrapper[4792]: I0318 15:47:36.853733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:36 crc kubenswrapper[4792]: I0318 15:47:36.854227 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" Mar 18 15:47:36 crc kubenswrapper[4792]: I0318 15:47:36.854282 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.081820 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-x5w94"] Mar 18 15:47:37 crc kubenswrapper[4792]: W0318 15:47:37.082983 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255ea945_6e83_4ead_b609_b47a6b5eaafa.slice/crio-bd548f55515d636dde277fb76712be097e8f8296a1024d394ac00921c7b0fb27 WatchSource:0}: Error finding container bd548f55515d636dde277fb76712be097e8f8296a1024d394ac00921c7b0fb27: Status 404 returned error can't find the container with id bd548f55515d636dde277fb76712be097e8f8296a1024d394ac00921c7b0fb27 Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.139661 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb"] Mar 18 15:47:37 crc kubenswrapper[4792]: W0318 15:47:37.145239 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58802970_175f_48a9_aa0b_25cbd849fecf.slice/crio-bccf1c38600d441ddd00735734647b8bf5c1521196c30eec4feef14475c29c02 WatchSource:0}: Error finding container bccf1c38600d441ddd00735734647b8bf5c1521196c30eec4feef14475c29c02: Status 404 returned error can't find the container with id bccf1c38600d441ddd00735734647b8bf5c1521196c30eec4feef14475c29c02 Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.155860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" event={"ID":"997642b8-111c-438c-906c-ace1a270f33b","Type":"ContainerStarted","Data":"b725b2536972bf65df8ae807a9b5ea1b5147916c1a2e8f9fd90609703ba65e9c"} Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.161441 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" event={"ID":"58802970-175f-48a9-aa0b-25cbd849fecf","Type":"ContainerStarted","Data":"bccf1c38600d441ddd00735734647b8bf5c1521196c30eec4feef14475c29c02"} Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.162585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" event={"ID":"d350c21d-f3fd-4b9e-a5f2-d7172fb87714","Type":"ContainerStarted","Data":"328069ed7fb1dd6cf826fbba3550803152ec5f5d855266e9f10290844da6a2d3"} Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.164872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" event={"ID":"15bde542-1ffd-48b4-b2cf-98d98348920e","Type":"ContainerStarted","Data":"819b5d62940c8b4513ea0d8aa7e145a915e52b0c3c06acb318ce78c679386ab6"} Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.164939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.166009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" event={"ID":"255ea945-6e83-4ead-b609-b47a6b5eaafa","Type":"ContainerStarted","Data":"bd548f55515d636dde277fb76712be097e8f8296a1024d394ac00921c7b0fb27"} Mar 18 15:47:37 crc kubenswrapper[4792]: I0318 15:47:37.181863 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podStartSLOduration=30.919201141 podStartE2EDuration="35.181840595s" podCreationTimestamp="2026-03-18 15:47:02 +0000 UTC" firstStartedPulling="2026-03-18 15:47:32.267828895 +0000 UTC m=+801.137157832" lastFinishedPulling="2026-03-18 15:47:36.530468349 +0000 UTC m=+805.399797286" observedRunningTime="2026-03-18 15:47:37.178509769 +0000 UTC m=+806.047838706" 
watchObservedRunningTime="2026-03-18 15:47:37.181840595 +0000 UTC m=+806.051169552" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.196793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" event={"ID":"997642b8-111c-438c-906c-ace1a270f33b","Type":"ContainerStarted","Data":"5dcc6f6b14d8a52d22f61402292d60fe489b9eb05f41e9b9dc199b1a98e8b8fd"} Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.198439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" event={"ID":"58802970-175f-48a9-aa0b-25cbd849fecf","Type":"ContainerStarted","Data":"23f0c3931b601cfebbbee806edca2a80efe999d8a63ba65e383d65fed039aba8"} Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.201901 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" event={"ID":"d350c21d-f3fd-4b9e-a5f2-d7172fb87714","Type":"ContainerStarted","Data":"512fe163824f359d8d5f5975fa58ef88900241d68fd16122473b59ec01c92082"} Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.203039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" event={"ID":"255ea945-6e83-4ead-b609-b47a6b5eaafa","Type":"ContainerStarted","Data":"d806e635f33980ab759c5237452236e0f1cdb9ef66f918326f252a397e48dbe0"} Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.203112 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.204685 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.215396 4792 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-27dml" podStartSLOduration=35.920032597 podStartE2EDuration="41.215376598s" podCreationTimestamp="2026-03-18 15:47:01 +0000 UTC" firstStartedPulling="2026-03-18 15:47:36.440678104 +0000 UTC m=+805.310007051" lastFinishedPulling="2026-03-18 15:47:41.736022115 +0000 UTC m=+810.605351052" observedRunningTime="2026-03-18 15:47:42.21335184 +0000 UTC m=+811.082680807" watchObservedRunningTime="2026-03-18 15:47:42.215376598 +0000 UTC m=+811.084705535" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.248409 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7m5sb" podStartSLOduration=36.662237528 podStartE2EDuration="41.248391053s" podCreationTimestamp="2026-03-18 15:47:01 +0000 UTC" firstStartedPulling="2026-03-18 15:47:37.147039989 +0000 UTC m=+806.016368926" lastFinishedPulling="2026-03-18 15:47:41.733193514 +0000 UTC m=+810.602522451" observedRunningTime="2026-03-18 15:47:42.246481657 +0000 UTC m=+811.115810614" watchObservedRunningTime="2026-03-18 15:47:42.248391053 +0000 UTC m=+811.117719990" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.285574 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podStartSLOduration=35.615030014 podStartE2EDuration="40.285552917s" podCreationTimestamp="2026-03-18 15:47:02 +0000 UTC" firstStartedPulling="2026-03-18 15:47:37.08652792 +0000 UTC m=+805.955856857" lastFinishedPulling="2026-03-18 15:47:41.757050813 +0000 UTC m=+810.626379760" observedRunningTime="2026-03-18 15:47:42.278856154 +0000 UTC m=+811.148185091" watchObservedRunningTime="2026-03-18 15:47:42.285552917 +0000 UTC m=+811.154881864" Mar 18 15:47:42 crc kubenswrapper[4792]: I0318 15:47:42.310685 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8" podStartSLOduration=36.48030788 podStartE2EDuration="41.310659602s" podCreationTimestamp="2026-03-18 15:47:01 +0000 UTC" firstStartedPulling="2026-03-18 15:47:36.84984825 +0000 UTC m=+805.719177187" lastFinishedPulling="2026-03-18 15:47:41.680199972 +0000 UTC m=+810.549528909" observedRunningTime="2026-03-18 15:47:42.306463301 +0000 UTC m=+811.175792238" watchObservedRunningTime="2026-03-18 15:47:42.310659602 +0000 UTC m=+811.179988549" Mar 18 15:47:43 crc kubenswrapper[4792]: I0318 15:47:43.021375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.192802 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqww9"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.194514 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.198607 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.198888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qr8cv" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.202999 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.204282 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqww9"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.208560 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-k4xld"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.209389 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k4xld" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.211073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dw4gh" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.229961 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dbbd4"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.230884 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.232519 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qpzpn" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.236328 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k4xld"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.263501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dbbd4"] Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.371217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf84g\" (UniqueName: \"kubernetes.io/projected/7d4badb4-1388-47c8-aed9-f8478388af41-kube-api-access-wf84g\") pod \"cert-manager-webhook-687f57d79b-dbbd4\" (UID: \"7d4badb4-1388-47c8-aed9-f8478388af41\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.371266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lptsj\" (UniqueName: \"kubernetes.io/projected/4e1f9db4-54f6-4217-a3cb-e9ea440f186e-kube-api-access-lptsj\") pod \"cert-manager-cainjector-cf98fcc89-dqww9\" (UID: \"4e1f9db4-54f6-4217-a3cb-e9ea440f186e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.371303 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6nw\" (UniqueName: \"kubernetes.io/projected/4766d1f1-5397-48cd-8429-daa6bc26a860-kube-api-access-kd6nw\") pod \"cert-manager-858654f9db-k4xld\" (UID: \"4766d1f1-5397-48cd-8429-daa6bc26a860\") " pod="cert-manager/cert-manager-858654f9db-k4xld" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.473158 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf84g\" (UniqueName: \"kubernetes.io/projected/7d4badb4-1388-47c8-aed9-f8478388af41-kube-api-access-wf84g\") pod \"cert-manager-webhook-687f57d79b-dbbd4\" (UID: \"7d4badb4-1388-47c8-aed9-f8478388af41\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.473769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lptsj\" (UniqueName: \"kubernetes.io/projected/4e1f9db4-54f6-4217-a3cb-e9ea440f186e-kube-api-access-lptsj\") pod \"cert-manager-cainjector-cf98fcc89-dqww9\" (UID: \"4e1f9db4-54f6-4217-a3cb-e9ea440f186e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.473923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6nw\" (UniqueName: \"kubernetes.io/projected/4766d1f1-5397-48cd-8429-daa6bc26a860-kube-api-access-kd6nw\") pod \"cert-manager-858654f9db-k4xld\" (UID: \"4766d1f1-5397-48cd-8429-daa6bc26a860\") " pod="cert-manager/cert-manager-858654f9db-k4xld" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.491442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6nw\" (UniqueName: \"kubernetes.io/projected/4766d1f1-5397-48cd-8429-daa6bc26a860-kube-api-access-kd6nw\") pod \"cert-manager-858654f9db-k4xld\" (UID: \"4766d1f1-5397-48cd-8429-daa6bc26a860\") " pod="cert-manager/cert-manager-858654f9db-k4xld" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.492004 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf84g\" (UniqueName: \"kubernetes.io/projected/7d4badb4-1388-47c8-aed9-f8478388af41-kube-api-access-wf84g\") pod \"cert-manager-webhook-687f57d79b-dbbd4\" (UID: \"7d4badb4-1388-47c8-aed9-f8478388af41\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.493033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lptsj\" (UniqueName: \"kubernetes.io/projected/4e1f9db4-54f6-4217-a3cb-e9ea440f186e-kube-api-access-lptsj\") pod \"cert-manager-cainjector-cf98fcc89-dqww9\" (UID: \"4e1f9db4-54f6-4217-a3cb-e9ea440f186e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.513228 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.527294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k4xld" Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.542642 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:49 crc kubenswrapper[4792]: W0318 15:47:49.787752 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e1f9db4_54f6_4217_a3cb_e9ea440f186e.slice/crio-e8fba33c79bb8abc0282e3728e3a89278bc267fe4e21984d0488743a931968bd WatchSource:0}: Error finding container e8fba33c79bb8abc0282e3728e3a89278bc267fe4e21984d0488743a931968bd: Status 404 returned error can't find the container with id e8fba33c79bb8abc0282e3728e3a89278bc267fe4e21984d0488743a931968bd Mar 18 15:47:49 crc kubenswrapper[4792]: I0318 15:47:49.789726 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqww9"] Mar 18 15:47:50 crc kubenswrapper[4792]: W0318 15:47:50.084307 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d4badb4_1388_47c8_aed9_f8478388af41.slice/crio-554ad30b3e186821335ad34eb86d87c9d6936c41840211ef2aae93270fb9e0ec WatchSource:0}: Error finding container 554ad30b3e186821335ad34eb86d87c9d6936c41840211ef2aae93270fb9e0ec: Status 404 returned error can't find the container with id 554ad30b3e186821335ad34eb86d87c9d6936c41840211ef2aae93270fb9e0ec Mar 18 15:47:50 crc kubenswrapper[4792]: W0318 15:47:50.086903 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4766d1f1_5397_48cd_8429_daa6bc26a860.slice/crio-6a0eeea84ac7f39ebf6e305c6c84559934421526a7451be2a6d16851f799a628 WatchSource:0}: Error finding container 6a0eeea84ac7f39ebf6e305c6c84559934421526a7451be2a6d16851f799a628: Status 404 returned error can't find the container with id 6a0eeea84ac7f39ebf6e305c6c84559934421526a7451be2a6d16851f799a628 Mar 18 15:47:50 crc kubenswrapper[4792]: I0318 15:47:50.087941 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dbbd4"] Mar 18 15:47:50 crc kubenswrapper[4792]: I0318 15:47:50.097605 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k4xld"] Mar 18 15:47:50 crc kubenswrapper[4792]: I0318 15:47:50.253933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" event={"ID":"7d4badb4-1388-47c8-aed9-f8478388af41","Type":"ContainerStarted","Data":"554ad30b3e186821335ad34eb86d87c9d6936c41840211ef2aae93270fb9e0ec"} Mar 18 15:47:50 crc kubenswrapper[4792]: I0318 15:47:50.254932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" event={"ID":"4e1f9db4-54f6-4217-a3cb-e9ea440f186e","Type":"ContainerStarted","Data":"e8fba33c79bb8abc0282e3728e3a89278bc267fe4e21984d0488743a931968bd"} Mar 18 15:47:50 crc kubenswrapper[4792]: 
I0318 15:47:50.255894 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k4xld" event={"ID":"4766d1f1-5397-48cd-8429-daa6bc26a860","Type":"ContainerStarted","Data":"6a0eeea84ac7f39ebf6e305c6c84559934421526a7451be2a6d16851f799a628"} Mar 18 15:47:51 crc kubenswrapper[4792]: I0318 15:47:51.765176 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 15:47:52 crc kubenswrapper[4792]: I0318 15:47:52.289521 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" event={"ID":"4e1f9db4-54f6-4217-a3cb-e9ea440f186e","Type":"ContainerStarted","Data":"924e17ecfe13ed1e1744ec500d762ef51caa85f9ccd4c5ab3ed63b08ab6afbc4"} Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.303904 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" event={"ID":"7d4badb4-1388-47c8-aed9-f8478388af41","Type":"ContainerStarted","Data":"37f0f4522a538a3d9c68509362dfe7d4698d78a02667f7e6b88511b106010552"} Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.304297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.306342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k4xld" event={"ID":"4766d1f1-5397-48cd-8429-daa6bc26a860","Type":"ContainerStarted","Data":"dd8837dd202303fa2e7fddaed675a1c363e3b3e6ce806b1f7cd2cb5afffa17e4"} Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.324784 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqww9" podStartSLOduration=3.096134724 podStartE2EDuration="5.324764793s" podCreationTimestamp="2026-03-18 15:47:49 +0000 UTC" firstStartedPulling="2026-03-18 
15:47:49.793183026 +0000 UTC m=+818.662511973" lastFinishedPulling="2026-03-18 15:47:52.021813105 +0000 UTC m=+820.891142042" observedRunningTime="2026-03-18 15:47:52.335219093 +0000 UTC m=+821.204548040" watchObservedRunningTime="2026-03-18 15:47:54.324764793 +0000 UTC m=+823.194093730" Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.326287 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" podStartSLOduration=1.452293164 podStartE2EDuration="5.326278266s" podCreationTimestamp="2026-03-18 15:47:49 +0000 UTC" firstStartedPulling="2026-03-18 15:47:50.087024128 +0000 UTC m=+818.956353105" lastFinishedPulling="2026-03-18 15:47:53.96100927 +0000 UTC m=+822.830338207" observedRunningTime="2026-03-18 15:47:54.319919683 +0000 UTC m=+823.189248630" watchObservedRunningTime="2026-03-18 15:47:54.326278266 +0000 UTC m=+823.195607203" Mar 18 15:47:54 crc kubenswrapper[4792]: I0318 15:47:54.337174 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-k4xld" podStartSLOduration=1.471803709 podStartE2EDuration="5.337152441s" podCreationTimestamp="2026-03-18 15:47:49 +0000 UTC" firstStartedPulling="2026-03-18 15:47:50.090126548 +0000 UTC m=+818.959455495" lastFinishedPulling="2026-03-18 15:47:53.9554753 +0000 UTC m=+822.824804227" observedRunningTime="2026-03-18 15:47:54.334549585 +0000 UTC m=+823.203878522" watchObservedRunningTime="2026-03-18 15:47:54.337152441 +0000 UTC m=+823.206481378" Mar 18 15:47:59 crc kubenswrapper[4792]: I0318 15:47:59.545453 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.142124 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564148-j8krc"] Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.143648 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.145955 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.147100 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-j8krc"] Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.148260 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.148529 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.241172 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnswg\" (UniqueName: \"kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg\") pod \"auto-csr-approver-29564148-j8krc\" (UID: \"d4d86a35-8172-4d88-bcd3-06612a005ddc\") " pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.342665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnswg\" (UniqueName: \"kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg\") pod \"auto-csr-approver-29564148-j8krc\" (UID: \"d4d86a35-8172-4d88-bcd3-06612a005ddc\") " pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.361468 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnswg\" (UniqueName: \"kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg\") pod \"auto-csr-approver-29564148-j8krc\" (UID: 
\"d4d86a35-8172-4d88-bcd3-06612a005ddc\") " pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.461240 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:00 crc kubenswrapper[4792]: I0318 15:48:00.867141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-j8krc"] Mar 18 15:48:01 crc kubenswrapper[4792]: I0318 15:48:01.350205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-j8krc" event={"ID":"d4d86a35-8172-4d88-bcd3-06612a005ddc","Type":"ContainerStarted","Data":"2483cfa34d5b1dbd7027344d0bb762bd5e02ff7ca7e83c1cb3e6ec05b63db404"} Mar 18 15:48:02 crc kubenswrapper[4792]: I0318 15:48:02.358648 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4d86a35-8172-4d88-bcd3-06612a005ddc" containerID="b02b657f2dc9309556d5da3fb8f3ac436e35756ad8ecdba90343ec70151242c2" exitCode=0 Mar 18 15:48:02 crc kubenswrapper[4792]: I0318 15:48:02.358696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-j8krc" event={"ID":"d4d86a35-8172-4d88-bcd3-06612a005ddc","Type":"ContainerDied","Data":"b02b657f2dc9309556d5da3fb8f3ac436e35756ad8ecdba90343ec70151242c2"} Mar 18 15:48:03 crc kubenswrapper[4792]: I0318 15:48:03.595948 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:03 crc kubenswrapper[4792]: I0318 15:48:03.680691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnswg\" (UniqueName: \"kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg\") pod \"d4d86a35-8172-4d88-bcd3-06612a005ddc\" (UID: \"d4d86a35-8172-4d88-bcd3-06612a005ddc\") " Mar 18 15:48:03 crc kubenswrapper[4792]: I0318 15:48:03.685342 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg" (OuterVolumeSpecName: "kube-api-access-tnswg") pod "d4d86a35-8172-4d88-bcd3-06612a005ddc" (UID: "d4d86a35-8172-4d88-bcd3-06612a005ddc"). InnerVolumeSpecName "kube-api-access-tnswg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:03 crc kubenswrapper[4792]: I0318 15:48:03.782831 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnswg\" (UniqueName: \"kubernetes.io/projected/d4d86a35-8172-4d88-bcd3-06612a005ddc-kube-api-access-tnswg\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:04 crc kubenswrapper[4792]: I0318 15:48:04.372944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564148-j8krc" event={"ID":"d4d86a35-8172-4d88-bcd3-06612a005ddc","Type":"ContainerDied","Data":"2483cfa34d5b1dbd7027344d0bb762bd5e02ff7ca7e83c1cb3e6ec05b63db404"} Mar 18 15:48:04 crc kubenswrapper[4792]: I0318 15:48:04.372996 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2483cfa34d5b1dbd7027344d0bb762bd5e02ff7ca7e83c1cb3e6ec05b63db404" Mar 18 15:48:04 crc kubenswrapper[4792]: I0318 15:48:04.373015 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564148-j8krc" Mar 18 15:48:04 crc kubenswrapper[4792]: I0318 15:48:04.650895 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-887f6"] Mar 18 15:48:04 crc kubenswrapper[4792]: I0318 15:48:04.656223 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564142-887f6"] Mar 18 15:48:05 crc kubenswrapper[4792]: I0318 15:48:05.864053 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da70f823-e9c1-4847-855e-5ecd2db92e8d" path="/var/lib/kubelet/pods/da70f823-e9c1-4847-855e-5ecd2db92e8d/volumes" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.409125 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7"] Mar 18 15:48:21 crc kubenswrapper[4792]: E0318 15:48:21.409912 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d86a35-8172-4d88-bcd3-06612a005ddc" containerName="oc" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.409925 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d86a35-8172-4d88-bcd3-06612a005ddc" containerName="oc" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.410073 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d86a35-8172-4d88-bcd3-06612a005ddc" containerName="oc" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.410934 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.416550 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.429654 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7"] Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.596263 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t"] Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.597520 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599495 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599610 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8p7\" (UniqueName: \"kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599686 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599720 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgsr\" (UniqueName: \"kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.599792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.617245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t"] Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.701824 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.701945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8p7\" (UniqueName: \"kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.702023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.702061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgsr\" (UniqueName: \"kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc 
kubenswrapper[4792]: I0318 15:48:21.702092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.702125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.703036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.703715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.703780 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.703886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.723422 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8p7\" (UniqueName: \"kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.740225 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.743699 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgsr\" (UniqueName: \"kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:21 crc kubenswrapper[4792]: I0318 15:48:21.912105 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:22 crc kubenswrapper[4792]: I0318 15:48:22.141583 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7"] Mar 18 15:48:22 crc kubenswrapper[4792]: W0318 15:48:22.378689 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3245cd7d_9b25_4016_a86b_44e81a9e2fb5.slice/crio-eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423 WatchSource:0}: Error finding container eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423: Status 404 returned error can't find the container with id eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423 Mar 18 15:48:22 crc kubenswrapper[4792]: I0318 15:48:22.380849 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t"] Mar 18 15:48:22 crc kubenswrapper[4792]: I0318 15:48:22.487476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" 
event={"ID":"3245cd7d-9b25-4016-a86b-44e81a9e2fb5","Type":"ContainerStarted","Data":"eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423"} Mar 18 15:48:22 crc kubenswrapper[4792]: I0318 15:48:22.489020 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerStarted","Data":"37823f567086396b18c96f820637b75e528878aaa12ea16828e0caae93a0c972"} Mar 18 15:48:22 crc kubenswrapper[4792]: I0318 15:48:22.489059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerStarted","Data":"2f6c99193dfdabc508c36dca51f275c913a496574a080cebc565d7b15d1fc01c"} Mar 18 15:48:23 crc kubenswrapper[4792]: I0318 15:48:23.496498 4792 generic.go:334] "Generic (PLEG): container finished" podID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerID="00be8a4cef13bcbcd87f8d17f7baa91515b1a98e6549eb93a901a1b9e48a8db2" exitCode=0 Mar 18 15:48:23 crc kubenswrapper[4792]: I0318 15:48:23.496589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" event={"ID":"3245cd7d-9b25-4016-a86b-44e81a9e2fb5","Type":"ContainerDied","Data":"00be8a4cef13bcbcd87f8d17f7baa91515b1a98e6549eb93a901a1b9e48a8db2"} Mar 18 15:48:23 crc kubenswrapper[4792]: I0318 15:48:23.498388 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerID="37823f567086396b18c96f820637b75e528878aaa12ea16828e0caae93a0c972" exitCode=0 Mar 18 15:48:23 crc kubenswrapper[4792]: I0318 15:48:23.498413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" 
event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerDied","Data":"37823f567086396b18c96f820637b75e528878aaa12ea16828e0caae93a0c972"} Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.156900 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.166698 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.170640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.188038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pz8t\" (UniqueName: \"kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.188119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.188188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.289308 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.289698 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pz8t\" (UniqueName: \"kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.289750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.289928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.290235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.309899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pz8t\" 
(UniqueName: \"kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t\") pod \"redhat-operators-9p74c\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.513564 4792 generic.go:334] "Generic (PLEG): container finished" podID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerID="01f37d4e05fac2ac6962feefd51e15f132106b8e64ab757f686e4580c03eab78" exitCode=0 Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.513632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" event={"ID":"3245cd7d-9b25-4016-a86b-44e81a9e2fb5","Type":"ContainerDied","Data":"01f37d4e05fac2ac6962feefd51e15f132106b8e64ab757f686e4580c03eab78"} Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.514549 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.517240 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerID="01105a52535931fc729fa7be3fe60b6f0996395503940b92fafdeba17419cfe5" exitCode=0 Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.517311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerDied","Data":"01105a52535931fc729fa7be3fe60b6f0996395503940b92fafdeba17419cfe5"} Mar 18 15:48:25 crc kubenswrapper[4792]: I0318 15:48:25.776142 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.140107 4792 scope.go:117] "RemoveContainer" 
containerID="286cc85541ebae6e9dd07424b9158d547c2f534f5ae5d8725d31601b65cbc6bc" Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.524156 4792 generic.go:334] "Generic (PLEG): container finished" podID="762cc279-4a4f-445a-9406-b7b89d26287a" containerID="a0790bc29fd6f0c2992e129e1e872dc1b0c0158d2ff7a1de623a34aed3621b37" exitCode=0 Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.524238 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerDied","Data":"a0790bc29fd6f0c2992e129e1e872dc1b0c0158d2ff7a1de623a34aed3621b37"} Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.524268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerStarted","Data":"6a9369aa300121a0ab844fd2ccde4ab8314711f18ad69bf5973644b7adacd5f3"} Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.526171 4792 generic.go:334] "Generic (PLEG): container finished" podID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerID="6af9f9429b81fc030cc088fd08406b0abf28047efa2ea0ddcbb5e996aa1425c1" exitCode=0 Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.526215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" event={"ID":"3245cd7d-9b25-4016-a86b-44e81a9e2fb5","Type":"ContainerDied","Data":"6af9f9429b81fc030cc088fd08406b0abf28047efa2ea0ddcbb5e996aa1425c1"} Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.528282 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerID="b67f1a6700993c318c06e1167878fe3c5781be9a771c44ffca5007c083f2475f" exitCode=0 Mar 18 15:48:26 crc kubenswrapper[4792]: I0318 15:48:26.528315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerDied","Data":"b67f1a6700993c318c06e1167878fe3c5781be9a771c44ffca5007c083f2475f"} Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.841997 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.850422 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.935818 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle\") pod \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936140 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8p7\" (UniqueName: \"kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7\") pod \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util\") pod \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\" (UID: \"e0ba453e-25c7-4e11-a393-74a6a4ee6e56\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936317 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util\") pod 
\"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwgsr\" (UniqueName: \"kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr\") pod \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle\") pod \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\" (UID: \"3245cd7d-9b25-4016-a86b-44e81a9e2fb5\") " Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.936780 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle" (OuterVolumeSpecName: "bundle") pod "e0ba453e-25c7-4e11-a393-74a6a4ee6e56" (UID: "e0ba453e-25c7-4e11-a393-74a6a4ee6e56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.937256 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.938151 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle" (OuterVolumeSpecName: "bundle") pod "3245cd7d-9b25-4016-a86b-44e81a9e2fb5" (UID: "3245cd7d-9b25-4016-a86b-44e81a9e2fb5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.942252 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr" (OuterVolumeSpecName: "kube-api-access-pwgsr") pod "3245cd7d-9b25-4016-a86b-44e81a9e2fb5" (UID: "3245cd7d-9b25-4016-a86b-44e81a9e2fb5"). InnerVolumeSpecName "kube-api-access-pwgsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:27 crc kubenswrapper[4792]: I0318 15:48:27.951658 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7" (OuterVolumeSpecName: "kube-api-access-qj8p7") pod "e0ba453e-25c7-4e11-a393-74a6a4ee6e56" (UID: "e0ba453e-25c7-4e11-a393-74a6a4ee6e56"). InnerVolumeSpecName "kube-api-access-qj8p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.038899 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwgsr\" (UniqueName: \"kubernetes.io/projected/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-kube-api-access-pwgsr\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.038943 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.038953 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8p7\" (UniqueName: \"kubernetes.io/projected/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-kube-api-access-qj8p7\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.474798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util" (OuterVolumeSpecName: "util") pod "3245cd7d-9b25-4016-a86b-44e81a9e2fb5" (UID: "3245cd7d-9b25-4016-a86b-44e81a9e2fb5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.545364 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3245cd7d-9b25-4016-a86b-44e81a9e2fb5-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.546697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" event={"ID":"3245cd7d-9b25-4016-a86b-44e81a9e2fb5","Type":"ContainerDied","Data":"eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423"} Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.546735 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab2df1d5aa2094615377119192b08040b99619f09af009fb5efe66f00ea9423" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.546751 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.549291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" event={"ID":"e0ba453e-25c7-4e11-a393-74a6a4ee6e56","Type":"ContainerDied","Data":"2f6c99193dfdabc508c36dca51f275c913a496574a080cebc565d7b15d1fc01c"} Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.549334 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6c99193dfdabc508c36dca51f275c913a496574a080cebc565d7b15d1fc01c" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.549357 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.654004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util" (OuterVolumeSpecName: "util") pod "e0ba453e-25c7-4e11-a393-74a6a4ee6e56" (UID: "e0ba453e-25c7-4e11-a393-74a6a4ee6e56"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:28 crc kubenswrapper[4792]: I0318 15:48:28.748000 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0ba453e-25c7-4e11-a393-74a6a4ee6e56-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:31 crc kubenswrapper[4792]: I0318 15:48:31.573553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerStarted","Data":"f34d91bf4351bc890200f6e2bc45739b4baa00aceee4819f6aee5e5334c2438c"} Mar 18 15:48:32 crc kubenswrapper[4792]: I0318 15:48:32.580923 4792 generic.go:334] "Generic (PLEG): container finished" podID="762cc279-4a4f-445a-9406-b7b89d26287a" containerID="f34d91bf4351bc890200f6e2bc45739b4baa00aceee4819f6aee5e5334c2438c" exitCode=0 Mar 18 15:48:32 crc kubenswrapper[4792]: I0318 15:48:32.580992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerDied","Data":"f34d91bf4351bc890200f6e2bc45739b4baa00aceee4819f6aee5e5334c2438c"} Mar 18 15:48:33 crc kubenswrapper[4792]: I0318 15:48:33.588055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerStarted","Data":"33fe4bbcdc02d112d28624f2ce3ea919ef94c04bce958c3e112e0e5573568d55"} Mar 18 15:48:33 crc kubenswrapper[4792]: I0318 15:48:33.653719 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9p74c" podStartSLOduration=2.166808319 podStartE2EDuration="8.653699873s" podCreationTimestamp="2026-03-18 15:48:25 +0000 UTC" firstStartedPulling="2026-03-18 15:48:26.527483355 +0000 UTC m=+855.396812292" lastFinishedPulling="2026-03-18 15:48:33.014374909 +0000 UTC m=+861.883703846" 
observedRunningTime="2026-03-18 15:48:33.648694042 +0000 UTC m=+862.518022979" watchObservedRunningTime="2026-03-18 15:48:33.653699873 +0000 UTC m=+862.523028810" Mar 18 15:48:35 crc kubenswrapper[4792]: I0318 15:48:35.515680 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:35 crc kubenswrapper[4792]: I0318 15:48:35.515837 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:36 crc kubenswrapper[4792]: I0318 15:48:36.557109 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9p74c" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="registry-server" probeResult="failure" output=< Mar 18 15:48:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 15:48:36 crc kubenswrapper[4792]: > Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032063 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt"] Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032322 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032336 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032353 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="pull" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032362 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="pull" Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032371 4792 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="pull" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032378 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="pull" Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032389 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="util" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032395 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="util" Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032402 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032410 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: E0318 15:48:37.032423 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="util" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032431 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="util" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032572 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ba453e-25c7-4e11-a393-74a6a4ee6e56" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.032586 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3245cd7d-9b25-4016-a86b-44e81a9e2fb5" containerName="extract" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.033090 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.035429 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.035652 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-2g7rg" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.036335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.050799 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt"] Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.171848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp42t\" (UniqueName: \"kubernetes.io/projected/3ff565a6-f95c-4656-be6c-cd52028bc42d-kube-api-access-cp42t\") pod \"cluster-logging-operator-66689c4bbf-zx4wt\" (UID: \"3ff565a6-f95c-4656-be6c-cd52028bc42d\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.272873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp42t\" (UniqueName: \"kubernetes.io/projected/3ff565a6-f95c-4656-be6c-cd52028bc42d-kube-api-access-cp42t\") pod \"cluster-logging-operator-66689c4bbf-zx4wt\" (UID: \"3ff565a6-f95c-4656-be6c-cd52028bc42d\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.293857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp42t\" (UniqueName: \"kubernetes.io/projected/3ff565a6-f95c-4656-be6c-cd52028bc42d-kube-api-access-cp42t\") pod 
\"cluster-logging-operator-66689c4bbf-zx4wt\" (UID: \"3ff565a6-f95c-4656-be6c-cd52028bc42d\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.347154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" Mar 18 15:48:37 crc kubenswrapper[4792]: I0318 15:48:37.618275 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt"] Mar 18 15:48:38 crc kubenswrapper[4792]: I0318 15:48:38.616771 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" event={"ID":"3ff565a6-f95c-4656-be6c-cd52028bc42d","Type":"ContainerStarted","Data":"f1f37d6fdf25eea3a3323a0fbf12679d2c661f6663fea7adc9e14ae382e36b25"} Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.188749 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4"] Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.190400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.193062 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.198662 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.198668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.198685 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-9d289" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.199299 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.199499 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.212561 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4"] Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.269924 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc 
kubenswrapper[4792]: I0318 15:48:43.269993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-apiservice-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.270229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c438c99-c0c4-43ec-a5e7-33a18425e63f-manager-config\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.270280 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7lm\" (UniqueName: \"kubernetes.io/projected/2c438c99-c0c4-43ec-a5e7-33a18425e63f-kube-api-access-sx7lm\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.270313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-webhook-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.371185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c438c99-c0c4-43ec-a5e7-33a18425e63f-manager-config\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.371229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7lm\" (UniqueName: \"kubernetes.io/projected/2c438c99-c0c4-43ec-a5e7-33a18425e63f-kube-api-access-sx7lm\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.371253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-webhook-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.371292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.371320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-apiservice-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: 
\"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.372345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2c438c99-c0c4-43ec-a5e7-33a18425e63f-manager-config\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.377747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-apiservice-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.383551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.383665 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c438c99-c0c4-43ec-a5e7-33a18425e63f-webhook-cert\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.395707 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sx7lm\" (UniqueName: \"kubernetes.io/projected/2c438c99-c0c4-43ec-a5e7-33a18425e63f-kube-api-access-sx7lm\") pod \"loki-operator-controller-manager-7c8cdd9f9f-lv5d4\" (UID: \"2c438c99-c0c4-43ec-a5e7-33a18425e63f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:43 crc kubenswrapper[4792]: I0318 15:48:43.516239 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:45 crc kubenswrapper[4792]: I0318 15:48:45.569811 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:45 crc kubenswrapper[4792]: I0318 15:48:45.644561 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:45 crc kubenswrapper[4792]: I0318 15:48:45.673821 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" event={"ID":"3ff565a6-f95c-4656-be6c-cd52028bc42d","Type":"ContainerStarted","Data":"2ad40541d9794eaffc601fdb2d5f55d766060767a3eaca73ccf2a8a2d0647e22"} Mar 18 15:48:45 crc kubenswrapper[4792]: I0318 15:48:45.690438 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-zx4wt" podStartSLOduration=0.955879415 podStartE2EDuration="8.69041526s" podCreationTimestamp="2026-03-18 15:48:37 +0000 UTC" firstStartedPulling="2026-03-18 15:48:37.622875625 +0000 UTC m=+866.492204562" lastFinishedPulling="2026-03-18 15:48:45.35741147 +0000 UTC m=+874.226740407" observedRunningTime="2026-03-18 15:48:45.689412609 +0000 UTC m=+874.558741556" watchObservedRunningTime="2026-03-18 15:48:45.69041526 +0000 UTC m=+874.559744207" Mar 18 15:48:45 crc kubenswrapper[4792]: W0318 15:48:45.747861 4792 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c438c99_c0c4_43ec_a5e7_33a18425e63f.slice/crio-d108a621fc70a9d6ed46eb4cfb96963db90f3674cd407b1a996c776d31603232 WatchSource:0}: Error finding container d108a621fc70a9d6ed46eb4cfb96963db90f3674cd407b1a996c776d31603232: Status 404 returned error can't find the container with id d108a621fc70a9d6ed46eb4cfb96963db90f3674cd407b1a996c776d31603232 Mar 18 15:48:45 crc kubenswrapper[4792]: I0318 15:48:45.748800 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4"] Mar 18 15:48:46 crc kubenswrapper[4792]: I0318 15:48:46.689161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" event={"ID":"2c438c99-c0c4-43ec-a5e7-33a18425e63f","Type":"ContainerStarted","Data":"d108a621fc70a9d6ed46eb4cfb96963db90f3674cd407b1a996c776d31603232"} Mar 18 15:48:48 crc kubenswrapper[4792]: I0318 15:48:48.947958 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:48 crc kubenswrapper[4792]: I0318 15:48:48.948782 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9p74c" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="registry-server" containerID="cri-o://33fe4bbcdc02d112d28624f2ce3ea919ef94c04bce958c3e112e0e5573568d55" gracePeriod=2 Mar 18 15:48:49 crc kubenswrapper[4792]: I0318 15:48:49.708937 4792 generic.go:334] "Generic (PLEG): container finished" podID="762cc279-4a4f-445a-9406-b7b89d26287a" containerID="33fe4bbcdc02d112d28624f2ce3ea919ef94c04bce958c3e112e0e5573568d55" exitCode=0 Mar 18 15:48:49 crc kubenswrapper[4792]: I0318 15:48:49.709006 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" 
event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerDied","Data":"33fe4bbcdc02d112d28624f2ce3ea919ef94c04bce958c3e112e0e5573568d55"} Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.053960 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.192093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities\") pod \"762cc279-4a4f-445a-9406-b7b89d26287a\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.192191 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pz8t\" (UniqueName: \"kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t\") pod \"762cc279-4a4f-445a-9406-b7b89d26287a\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.192238 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content\") pod \"762cc279-4a4f-445a-9406-b7b89d26287a\" (UID: \"762cc279-4a4f-445a-9406-b7b89d26287a\") " Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.194007 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities" (OuterVolumeSpecName: "utilities") pod "762cc279-4a4f-445a-9406-b7b89d26287a" (UID: "762cc279-4a4f-445a-9406-b7b89d26287a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.196756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t" (OuterVolumeSpecName: "kube-api-access-5pz8t") pod "762cc279-4a4f-445a-9406-b7b89d26287a" (UID: "762cc279-4a4f-445a-9406-b7b89d26287a"). InnerVolumeSpecName "kube-api-access-5pz8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.294424 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.294465 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pz8t\" (UniqueName: \"kubernetes.io/projected/762cc279-4a4f-445a-9406-b7b89d26287a-kube-api-access-5pz8t\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.324659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "762cc279-4a4f-445a-9406-b7b89d26287a" (UID: "762cc279-4a4f-445a-9406-b7b89d26287a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.396128 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762cc279-4a4f-445a-9406-b7b89d26287a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.721577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" event={"ID":"2c438c99-c0c4-43ec-a5e7-33a18425e63f","Type":"ContainerStarted","Data":"7b1746bbd148f1baaca4c85d65c6817269b56d83496fb8702869ce33e38412cc"} Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.723794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p74c" event={"ID":"762cc279-4a4f-445a-9406-b7b89d26287a","Type":"ContainerDied","Data":"6a9369aa300121a0ab844fd2ccde4ab8314711f18ad69bf5973644b7adacd5f3"} Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.723831 4792 scope.go:117] "RemoveContainer" containerID="33fe4bbcdc02d112d28624f2ce3ea919ef94c04bce958c3e112e0e5573568d55" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.723840 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9p74c" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.739704 4792 scope.go:117] "RemoveContainer" containerID="f34d91bf4351bc890200f6e2bc45739b4baa00aceee4819f6aee5e5334c2438c" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.755464 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.763177 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9p74c"] Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.767057 4792 scope.go:117] "RemoveContainer" containerID="a0790bc29fd6f0c2992e129e1e872dc1b0c0158d2ff7a1de623a34aed3621b37" Mar 18 15:48:51 crc kubenswrapper[4792]: I0318 15:48:51.865401 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" path="/var/lib/kubelet/pods/762cc279-4a4f-445a-9406-b7b89d26287a/volumes" Mar 18 15:48:57 crc kubenswrapper[4792]: I0318 15:48:57.768888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" event={"ID":"2c438c99-c0c4-43ec-a5e7-33a18425e63f","Type":"ContainerStarted","Data":"92b099cd56f0bc7142f2409dd4fc122d77310cc3d14290f547fd1a8c5a89cdf8"} Mar 18 15:48:57 crc kubenswrapper[4792]: I0318 15:48:57.769494 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:57 crc kubenswrapper[4792]: I0318 15:48:57.771011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 15:48:57 crc kubenswrapper[4792]: I0318 15:48:57.790601 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" podStartSLOduration=3.022010396 podStartE2EDuration="14.790585258s" podCreationTimestamp="2026-03-18 15:48:43 +0000 UTC" firstStartedPulling="2026-03-18 15:48:45.750279054 +0000 UTC m=+874.619607991" lastFinishedPulling="2026-03-18 15:48:57.518853916 +0000 UTC m=+886.388182853" observedRunningTime="2026-03-18 15:48:57.787203819 +0000 UTC m=+886.656532756" watchObservedRunningTime="2026-03-18 15:48:57.790585258 +0000 UTC m=+886.659914195" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.952130 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 18 15:49:02 crc kubenswrapper[4792]: E0318 15:49:02.952897 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="extract-utilities" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.952911 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="extract-utilities" Mar 18 15:49:02 crc kubenswrapper[4792]: E0318 15:49:02.952921 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="extract-content" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.952928 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="extract-content" Mar 18 15:49:02 crc kubenswrapper[4792]: E0318 15:49:02.952944 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="registry-server" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.952950 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="registry-server" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.953109 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="762cc279-4a4f-445a-9406-b7b89d26287a" containerName="registry-server" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.953613 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.955702 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.956009 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 18 15:49:02 crc kubenswrapper[4792]: I0318 15:49:02.966843 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.085554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.085607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qww\" (UniqueName: \"kubernetes.io/projected/81a99c02-e20b-4945-b2fb-13fe3435311f-kube-api-access-t8qww\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.186998 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.187046 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t8qww\" (UniqueName: \"kubernetes.io/projected/81a99c02-e20b-4945-b2fb-13fe3435311f-kube-api-access-t8qww\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.191500 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.191543 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/996bb4368a8e2077b388180845ea5bd970c36fa4b864b5823f5d9ee387961d35/globalmount\"" pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.212169 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qww\" (UniqueName: \"kubernetes.io/projected/81a99c02-e20b-4945-b2fb-13fe3435311f-kube-api-access-t8qww\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.228428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1832a185-b19d-49f3-a7af-703ccbba26e5\") pod \"minio\" (UID: \"81a99c02-e20b-4945-b2fb-13fe3435311f\") " pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.273470 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.717662 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.725228 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:49:03 crc kubenswrapper[4792]: I0318 15:49:03.814034 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"81a99c02-e20b-4945-b2fb-13fe3435311f","Type":"ContainerStarted","Data":"591b5a0d963df9fba99ba30ec9309eabe80020ab07743930c13d3c409bd92529"} Mar 18 15:49:06 crc kubenswrapper[4792]: I0318 15:49:06.836357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"81a99c02-e20b-4945-b2fb-13fe3435311f","Type":"ContainerStarted","Data":"c8e3d8211df1417312c487a5395bae7e86dae3f27df89d22361c36d716537526"} Mar 18 15:49:06 crc kubenswrapper[4792]: I0318 15:49:06.854871 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.954923388 podStartE2EDuration="6.854853042s" podCreationTimestamp="2026-03-18 15:49:00 +0000 UTC" firstStartedPulling="2026-03-18 15:49:03.725198286 +0000 UTC m=+892.594527223" lastFinishedPulling="2026-03-18 15:49:06.62512794 +0000 UTC m=+895.494456877" observedRunningTime="2026-03-18 15:49:06.848351973 +0000 UTC m=+895.717680930" watchObservedRunningTime="2026-03-18 15:49:06.854853042 +0000 UTC m=+895.724181979" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.155530 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.157064 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.163655 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.163889 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-fr5tr" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.164023 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.164314 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.164364 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.166394 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.252655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.252718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gkcs\" (UniqueName: \"kubernetes.io/projected/1c367fec-09d4-46fa-8900-0c508ced5de9-kube-api-access-5gkcs\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: 
\"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.252772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.252815 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.252910 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-config\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.322395 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.323400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.331787 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.333084 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.333456 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.344967 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.354867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gkcs\" (UniqueName: \"kubernetes.io/projected/1c367fec-09d4-46fa-8900-0c508ced5de9-kube-api-access-5gkcs\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.354923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.355004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-http\") pod 
\"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.355037 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-config\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.355137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.356205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.356432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c367fec-09d4-46fa-8900-0c508ced5de9-config\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.362950 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" 
(UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.369127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/1c367fec-09d4-46fa-8900-0c508ced5de9-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.401552 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gkcs\" (UniqueName: \"kubernetes.io/projected/1c367fec-09d4-46fa-8900-0c508ced5de9-kube-api-access-5gkcs\") pod \"logging-loki-distributor-9c6b6d984-dfcx2\" (UID: \"1c367fec-09d4-46fa-8900-0c508ced5de9\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.436375 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.437656 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.439671 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.440473 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.440590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.458578 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-config\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.458685 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.458719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5v2q\" (UniqueName: \"kubernetes.io/projected/f9dbb2aa-f06a-431d-b181-29315e9170cb-kube-api-access-n5v2q\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 
15:49:13.458854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.458941 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.459146 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.476069 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.529344 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.532883 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540078 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540373 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540545 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-fdvzn" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540681 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540792 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.540901 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.551045 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.552523 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.560840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.560915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.560941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-config\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" 
(UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5v2q\" (UniqueName: \"kubernetes.io/projected/f9dbb2aa-f06a-431d-b181-29315e9170cb-kube-api-access-n5v2q\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561166 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skxl4\" (UniqueName: \"kubernetes.io/projected/8927cd79-8eff-4f53-a676-782cbb366e9c-kube-api-access-skxl4\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-config\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.561236 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.568412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.569471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9dbb2aa-f06a-431d-b181-29315e9170cb-config\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.590524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.590993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.594453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f9dbb2aa-f06a-431d-b181-29315e9170cb-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.610097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5v2q\" (UniqueName: \"kubernetes.io/projected/f9dbb2aa-f06a-431d-b181-29315e9170cb-kube-api-access-n5v2q\") pod \"logging-loki-querier-6dcbdf8bb8-86w5q\" (UID: \"f9dbb2aa-f06a-431d-b181-29315e9170cb\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.610576 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.610621 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl"] Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.636783 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.662664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-rbac\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-rbac\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7mvj\" (UniqueName: \"kubernetes.io/projected/ff112f55-c823-4d01-a355-08279e6a0391-kube-api-access-z7mvj\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " 
pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666868 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.666934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skxl4\" (UniqueName: \"kubernetes.io/projected/8927cd79-8eff-4f53-a676-782cbb366e9c-kube-api-access-skxl4\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-config\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667081 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667225 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667251 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tenants\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667318 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tenants\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " 
pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl9d\" (UniqueName: \"kubernetes.io/projected/e0054d36-2f0d-43c8-93d2-774d775a22ea-kube-api-access-nhl9d\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.667649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.670992 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-config\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.675370 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.678556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.678613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/8927cd79-8eff-4f53-a676-782cbb366e9c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.694612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skxl4\" (UniqueName: \"kubernetes.io/projected/8927cd79-8eff-4f53-a676-782cbb366e9c-kube-api-access-skxl4\") pod \"logging-loki-query-frontend-ff66c4dc9-gqm44\" (UID: \"8927cd79-8eff-4f53-a676-782cbb366e9c\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.766389 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.768799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tenants\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.768849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.768885 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl9d\" (UniqueName: \"kubernetes.io/projected/e0054d36-2f0d-43c8-93d2-774d775a22ea-kube-api-access-nhl9d\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.768910 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.768935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-rbac\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-rbac\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769301 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z7mvj\" (UniqueName: \"kubernetes.io/projected/ff112f55-c823-4d01-a355-08279e6a0391-kube-api-access-z7mvj\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769326 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " 
pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.769479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tenants\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.771240 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.771300 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.771400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.771459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-rbac\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.771777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-rbac\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.772144 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ff112f55-c823-4d01-a355-08279e6a0391-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.772577 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-ca-bundle\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.774019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.774157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tenants\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.774941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tenants\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.778175 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.778412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0054d36-2f0d-43c8-93d2-774d775a22ea-lokistack-gateway\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.779763 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0054d36-2f0d-43c8-93d2-774d775a22ea-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.795475 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7mvj\" (UniqueName: \"kubernetes.io/projected/ff112f55-c823-4d01-a355-08279e6a0391-kube-api-access-z7mvj\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.796366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ff112f55-c823-4d01-a355-08279e6a0391-tls-secret\") pod \"logging-loki-gateway-599d7cd94d-7f8hf\" (UID: \"ff112f55-c823-4d01-a355-08279e6a0391\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.797867 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhl9d\" (UniqueName: \"kubernetes.io/projected/e0054d36-2f0d-43c8-93d2-774d775a22ea-kube-api-access-nhl9d\") pod \"logging-loki-gateway-599d7cd94d-c8sjl\" (UID: \"e0054d36-2f0d-43c8-93d2-774d775a22ea\") " pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.924412 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:13 crc kubenswrapper[4792]: I0318 15:49:13.953722 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.062391 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.079291 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.184872 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.209051 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf"]
Mar 18 15:49:14 crc kubenswrapper[4792]: W0318 15:49:14.217272 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff112f55_c823_4d01_a355_08279e6a0391.slice/crio-8f76dcaa5e41f8a99731998f57052cc1ccc6e38df8b4e0b4c55db26b32032de4 WatchSource:0}: Error finding container 8f76dcaa5e41f8a99731998f57052cc1ccc6e38df8b4e0b4c55db26b32032de4: Status 404 returned error can't find the container with id 8f76dcaa5e41f8a99731998f57052cc1ccc6e38df8b4e0b4c55db26b32032de4
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.323512 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.325725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.327646 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.330341 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.331794 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-18682c74-0d19-45be-a095-30ed96badb39\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18682c74-0d19-45be-a095-30ed96badb39\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-config\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379574 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4r6\" (UniqueName: \"kubernetes.io/projected/c6c59cfd-2add-4b4e-81c1-bacc77deae06-kube-api-access-8n4r6\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.379643 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.382144 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.383210 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.389077 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.394128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.394128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.465663 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.467148 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.469659 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.469787 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.475664 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwf9\" (UniqueName: \"kubernetes.io/projected/d9a6fb1e-3b68-4210-9322-e13a634fac2a-kube-api-access-ltwf9\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4r6\" (UniqueName: \"kubernetes.io/projected/c6c59cfd-2add-4b4e-81c1-bacc77deae06-kube-api-access-8n4r6\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480884 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.480957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-config\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.481005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.481038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.481064 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-18682c74-0d19-45be-a095-30ed96badb39\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18682c74-0d19-45be-a095-30ed96badb39\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.481110 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.481148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-config\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.482513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-config\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.482671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.486670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.486670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.499055 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c6c59cfd-2add-4b4e-81c1-bacc77deae06-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.501889 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.502025 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.502062 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cde28105c94bd17206b92b63d4f5e4d2ad1d8b131bfdc8e06d18ed791b0836e/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.502218 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-18682c74-0d19-45be-a095-30ed96badb39\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18682c74-0d19-45be-a095-30ed96badb39\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c117815b366f006f1dd6a275fc305730dd42a097bb4b1a3303cc8289bb2bfdac/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.504015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4r6\" (UniqueName: \"kubernetes.io/projected/c6c59cfd-2add-4b4e-81c1-bacc77deae06-kube-api-access-8n4r6\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: W0318 15:49:14.514095 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0054d36_2f0d_43c8_93d2_774d775a22ea.slice/crio-7bd595778701b5e879a3778106375dd93823dab7b4a69e62712b3c0aaa259165 WatchSource:0}: Error finding container 7bd595778701b5e879a3778106375dd93823dab7b4a69e62712b3c0aaa259165: Status 404 returned error can't find the container with id 7bd595778701b5e879a3778106375dd93823dab7b4a69e62712b3c0aaa259165
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.514724 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl"]
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.533601 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-18682c74-0d19-45be-a095-30ed96badb39\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18682c74-0d19-45be-a095-30ed96badb39\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.534162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e1479aa-cdc5-40cc-903e-ab87525d78d7\") pod \"logging-loki-ingester-0\" (UID: \"c6c59cfd-2add-4b4e-81c1-bacc77deae06\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.581993 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582080 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwf9\" (UniqueName: \"kubernetes.io/projected/d9a6fb1e-3b68-4210-9322-e13a634fac2a-kube-api-access-ltwf9\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582245 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582305 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jh6j\" (UniqueName: \"kubernetes.io/projected/9530b94d-2bb9-4e98-832f-07c4d8b2277a-kube-api-access-6jh6j\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-config\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.582460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.590796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.591686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6fb1e-3b68-4210-9322-e13a634fac2a-config\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.602689 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.621916 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwf9\" (UniqueName: \"kubernetes.io/projected/d9a6fb1e-3b68-4210-9322-e13a634fac2a-kube-api-access-ltwf9\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.626849 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.628748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/d9a6fb1e-3b68-4210-9322-e13a634fac2a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.633272 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.633340 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d4671b5d11ff6e0f54e3ef98824bc005ea1b1d37f21b6bff663b2c655581440/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.643541 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jh6j\" (UniqueName: \"kubernetes.io/projected/9530b94d-2bb9-4e98-832f-07c4d8b2277a-kube-api-access-6jh6j\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688918 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.688994 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.689012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.690269 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.691995 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9530b94d-2bb9-4e98-832f-07c4d8b2277a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.699589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.702655 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.703817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9530b94d-2bb9-4e98-832f-07c4d8b2277a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.715267 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.715319 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b182937da45739b3b089d85ceb72e822dedd004c70cba3e0aef6cd5375abadb/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.716187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jh6j\" (UniqueName: \"kubernetes.io/projected/9530b94d-2bb9-4e98-832f-07c4d8b2277a-kube-api-access-6jh6j\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.743228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6bceda2a-fbac-4aee-8bdc-57db5aa1a368\") pod \"logging-loki-compactor-0\" (UID: \"d9a6fb1e-3b68-4210-9322-e13a634fac2a\") " pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.747785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e1d859e-e738-4e5f-a992-0f02e526c90e\") pod \"logging-loki-index-gateway-0\" (UID: \"9530b94d-2bb9-4e98-832f-07c4d8b2277a\") " pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.862307 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.894301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" event={"ID":"8927cd79-8eff-4f53-a676-782cbb366e9c","Type":"ContainerStarted","Data":"c01cd1f5e36b811d68a7f486e2f31464adf206b95d47864ec4bdeacdc45ac79a"}
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.895463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" event={"ID":"e0054d36-2f0d-43c8-93d2-774d775a22ea","Type":"ContainerStarted","Data":"7bd595778701b5e879a3778106375dd93823dab7b4a69e62712b3c0aaa259165"}
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.896543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" event={"ID":"f9dbb2aa-f06a-431d-b181-29315e9170cb","Type":"ContainerStarted","Data":"b41188240ce3fbf3793120bb786023d425e60bf9bacaa0c4d1c737612042d139"}
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.898182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" event={"ID":"ff112f55-c823-4d01-a355-08279e6a0391","Type":"ContainerStarted","Data":"8f76dcaa5e41f8a99731998f57052cc1ccc6e38df8b4e0b4c55db26b32032de4"}
Mar 18 15:49:14 crc kubenswrapper[4792]: I0318 15:49:14.900606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" event={"ID":"1c367fec-09d4-46fa-8900-0c508ced5de9","Type":"ContainerStarted","Data":"04421ce9fb5854edff1d5aec9875fdbdfc4009c5324023db6ebb9bf1220f50eb"}
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.043544 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.100660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 18 15:49:15 crc kubenswrapper[4792]: W0318 15:49:15.106723 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6c59cfd_2add_4b4e_81c1_bacc77deae06.slice/crio-220e2c76760c6f7d8b3d34404a93f8541bf754671050e3c1b794b404f91f3421 WatchSource:0}: Error finding container 220e2c76760c6f7d8b3d34404a93f8541bf754671050e3c1b794b404f91f3421: Status 404 returned error can't find the container with id 220e2c76760c6f7d8b3d34404a93f8541bf754671050e3c1b794b404f91f3421
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.245069 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.449072 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.907807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"d9a6fb1e-3b68-4210-9322-e13a634fac2a","Type":"ContainerStarted","Data":"626f7e9c86dcd287389ed7a72f1358925a4290ab89c1d3eb0634edb43c82625a"}
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.909487 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c6c59cfd-2add-4b4e-81c1-bacc77deae06","Type":"ContainerStarted","Data":"220e2c76760c6f7d8b3d34404a93f8541bf754671050e3c1b794b404f91f3421"}
Mar 18 15:49:15 crc kubenswrapper[4792]: I0318 15:49:15.911392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"9530b94d-2bb9-4e98-832f-07c4d8b2277a","Type":"ContainerStarted","Data":"7f4788817962ade890c423ff9f68f4bd14edb53a1ba9f9bb47b22c41668e171d"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.939578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"9530b94d-2bb9-4e98-832f-07c4d8b2277a","Type":"ContainerStarted","Data":"242cd1b856096cd6a97c04f09a43c904e1bb86d9246f130e6019f7142880d1bb"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.941174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" event={"ID":"1c367fec-09d4-46fa-8900-0c508ced5de9","Type":"ContainerStarted","Data":"0bc45e92eacde4b99b788c2b2c9be96677ec79a9418f9b08f5347673c0db1f3d"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.944449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" event={"ID":"8927cd79-8eff-4f53-a676-782cbb366e9c","Type":"ContainerStarted","Data":"1e75599f47c6d310b916ec779464d1f6bb407582aeabef1803fa92afa43ad5f5"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.946929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"d9a6fb1e-3b68-4210-9322-e13a634fac2a","Type":"ContainerStarted","Data":"480f49870aa2289a32675a0bc49c59ffac21e38f3b1975bd7ce38e6112ab5248"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.948768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c6c59cfd-2add-4b4e-81c1-bacc77deae06","Type":"ContainerStarted","Data":"ddbf04aed77390fbb131085bd31b298402ae6510f37330e7e2df96915452c23a"}
Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.950432 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl"
event={"ID":"e0054d36-2f0d-43c8-93d2-774d775a22ea","Type":"ContainerStarted","Data":"bf3319b6cec376cdcadebd09894eae26b899a604536f0046f8d18fdf581a2cbc"} Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.951914 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" event={"ID":"f9dbb2aa-f06a-431d-b181-29315e9170cb","Type":"ContainerStarted","Data":"10a29ab9dd3e008c60c53ba79947a69cac8bf69a24fe0ae44dcf507c42021f97"} Mar 18 15:49:18 crc kubenswrapper[4792]: I0318 15:49:18.953182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" event={"ID":"ff112f55-c823-4d01-a355-08279e6a0391","Type":"ContainerStarted","Data":"e4894a660e0360c46b15069b7df398cb762047b6574809b10d328e04dbb8252b"} Mar 18 15:49:19 crc kubenswrapper[4792]: I0318 15:49:19.958027 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 15:49:19 crc kubenswrapper[4792]: I0318 15:49:19.975605 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=4.20246611 podStartE2EDuration="6.975590038s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:15.255796922 +0000 UTC m=+904.125125859" lastFinishedPulling="2026-03-18 15:49:18.02892085 +0000 UTC m=+906.898249787" observedRunningTime="2026-03-18 15:49:19.973403177 +0000 UTC m=+908.842732124" watchObservedRunningTime="2026-03-18 15:49:19.975590038 +0000 UTC m=+908.844918965" Mar 18 15:49:21 crc kubenswrapper[4792]: I0318 15:49:21.973442 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:22 crc kubenswrapper[4792]: I0318 15:49:22.016168 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" podStartSLOduration=5.075040826 podStartE2EDuration="9.016152257s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.086548089 +0000 UTC m=+902.955877026" lastFinishedPulling="2026-03-18 15:49:18.02765952 +0000 UTC m=+906.896988457" observedRunningTime="2026-03-18 15:49:22.012155157 +0000 UTC m=+910.881484114" watchObservedRunningTime="2026-03-18 15:49:22.016152257 +0000 UTC m=+910.885481194" Mar 18 15:49:22 crc kubenswrapper[4792]: I0318 15:49:22.978916 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 18 15:49:22 crc kubenswrapper[4792]: I0318 15:49:22.980073 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.003426 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=7.138249318 podStartE2EDuration="10.003407504s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:15.109852142 +0000 UTC m=+903.979181079" lastFinishedPulling="2026-03-18 15:49:17.975010328 +0000 UTC m=+906.844339265" observedRunningTime="2026-03-18 15:49:23.001571725 +0000 UTC m=+911.870900702" watchObservedRunningTime="2026-03-18 15:49:23.003407504 +0000 UTC m=+911.872736441" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.041529 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" podStartSLOduration=6.225460915 podStartE2EDuration="10.041512136s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.192521965 +0000 UTC m=+903.061850902" lastFinishedPulling="2026-03-18 15:49:18.008573186 +0000 UTC m=+906.877902123" 
observedRunningTime="2026-03-18 15:49:23.041076042 +0000 UTC m=+911.910405009" watchObservedRunningTime="2026-03-18 15:49:23.041512136 +0000 UTC m=+911.910841073" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.043948 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=7.480905535 podStartE2EDuration="10.043939394s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:15.461661586 +0000 UTC m=+904.330990523" lastFinishedPulling="2026-03-18 15:49:18.024695445 +0000 UTC m=+906.894024382" observedRunningTime="2026-03-18 15:49:23.026262813 +0000 UTC m=+911.895591760" watchObservedRunningTime="2026-03-18 15:49:23.043939394 +0000 UTC m=+911.913268331" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.069110 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" podStartSLOduration=6.124546316 podStartE2EDuration="10.069092467s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.084207904 +0000 UTC m=+902.953536851" lastFinishedPulling="2026-03-18 15:49:18.028754065 +0000 UTC m=+906.898083002" observedRunningTime="2026-03-18 15:49:23.059791966 +0000 UTC m=+911.929120903" watchObservedRunningTime="2026-03-18 15:49:23.069092467 +0000 UTC m=+911.938421404" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.637784 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:23 crc kubenswrapper[4792]: I0318 15:49:23.767337 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.003671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" event={"ID":"e0054d36-2f0d-43c8-93d2-774d775a22ea","Type":"ContainerStarted","Data":"71de88b67fe2627cde9838b68e6c7e395846138c1535ca4b1ab0c8d0548eb62a"} Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.006369 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.006519 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.007882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" event={"ID":"ff112f55-c823-4d01-a355-08279e6a0391","Type":"ContainerStarted","Data":"6f790a4c5971484928df3b430f332a56eb53483eff14c9d6c7861c04087357e5"} Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.008239 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.008273 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.019537 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.022340 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.026206 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.027706 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.045750 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podStartSLOduration=2.640271122 podStartE2EDuration="13.045731049s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.518194559 +0000 UTC m=+903.387523496" lastFinishedPulling="2026-03-18 15:49:24.923654486 +0000 UTC m=+913.792983423" observedRunningTime="2026-03-18 15:49:26.036841273 +0000 UTC m=+914.906170230" watchObservedRunningTime="2026-03-18 15:49:26.045731049 +0000 UTC m=+914.915059986" Mar 18 15:49:26 crc kubenswrapper[4792]: I0318 15:49:26.100759 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podStartSLOduration=2.391975845 podStartE2EDuration="13.100743887s" podCreationTimestamp="2026-03-18 15:49:13 +0000 UTC" firstStartedPulling="2026-03-18 15:49:14.220111952 +0000 UTC m=+903.089440889" lastFinishedPulling="2026-03-18 15:49:24.928879984 +0000 UTC m=+913.798208931" observedRunningTime="2026-03-18 15:49:26.092593434 +0000 UTC m=+914.961922371" watchObservedRunningTime="2026-03-18 15:49:26.100743887 +0000 UTC m=+914.970072824" Mar 18 15:49:30 crc kubenswrapper[4792]: I0318 15:49:30.322556 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:49:30 crc kubenswrapper[4792]: I0318 15:49:30.322932 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:49:33 crc kubenswrapper[4792]: I0318 15:49:33.482589 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 15:49:33 crc kubenswrapper[4792]: I0318 15:49:33.641787 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 15:49:33 crc kubenswrapper[4792]: I0318 15:49:33.772330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 15:49:34 crc kubenswrapper[4792]: I0318 15:49:34.651013 4792 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 18 15:49:34 crc kubenswrapper[4792]: I0318 15:49:34.651095 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c6c59cfd-2add-4b4e-81c1-bacc77deae06" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:49:34 crc kubenswrapper[4792]: I0318 15:49:34.945384 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 18 15:49:35 crc kubenswrapper[4792]: I0318 15:49:35.049604 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 18 15:49:44 crc kubenswrapper[4792]: I0318 15:49:44.649186 4792 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure 
output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 18 15:49:44 crc kubenswrapper[4792]: I0318 15:49:44.649750 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c6c59cfd-2add-4b4e-81c1-bacc77deae06" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:49:54 crc kubenswrapper[4792]: I0318 15:49:54.648758 4792 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 18 15:49:54 crc kubenswrapper[4792]: I0318 15:49:54.649289 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c6c59cfd-2add-4b4e-81c1-bacc77deae06" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.137911 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564150-gtk9r"] Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.139721 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.142150 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.142164 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.142509 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.153556 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-gtk9r"] Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.220404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br992\" (UniqueName: \"kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992\") pod \"auto-csr-approver-29564150-gtk9r\" (UID: \"5f563e18-52c5-4193-b9e9-70d632536974\") " pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.321611 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.321687 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:50:00 crc 
kubenswrapper[4792]: I0318 15:50:00.322243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br992\" (UniqueName: \"kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992\") pod \"auto-csr-approver-29564150-gtk9r\" (UID: \"5f563e18-52c5-4193-b9e9-70d632536974\") " pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.341074 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br992\" (UniqueName: \"kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992\") pod \"auto-csr-approver-29564150-gtk9r\" (UID: \"5f563e18-52c5-4193-b9e9-70d632536974\") " pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.494728 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:00 crc kubenswrapper[4792]: I0318 15:50:00.910850 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-gtk9r"] Mar 18 15:50:01 crc kubenswrapper[4792]: I0318 15:50:01.270158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" event={"ID":"5f563e18-52c5-4193-b9e9-70d632536974","Type":"ContainerStarted","Data":"94f1303258d06f9210b4918ed8e7054166261585f909cb5a67590934792cbaa5"} Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.294527 4792 generic.go:334] "Generic (PLEG): container finished" podID="5f563e18-52c5-4193-b9e9-70d632536974" containerID="0f1ad947bfab13337bc7241c0610d2c4eb91bc8bb5dd93a30b161eed8e766a2d" exitCode=0 Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.294644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" 
event={"ID":"5f563e18-52c5-4193-b9e9-70d632536974","Type":"ContainerDied","Data":"0f1ad947bfab13337bc7241c0610d2c4eb91bc8bb5dd93a30b161eed8e766a2d"} Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.474329 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.475686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.493235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.565879 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.565936 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.566033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxc82\" (UniqueName: \"kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.667096 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.667157 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.667248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxc82\" (UniqueName: \"kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.667668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.667717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.693894 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rxc82\" (UniqueName: \"kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82\") pod \"redhat-marketplace-4rcmm\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:03 crc kubenswrapper[4792]: I0318 15:50:03.793403 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.233458 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:04 crc kubenswrapper[4792]: W0318 15:50:04.237164 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90f3673_7c3b_4466_8961_9b7363676c21.slice/crio-27cb2f4a3741a009eb9e2c465f59fcc6be8619d7527c3cbe15d2584d7ff8bdf8 WatchSource:0}: Error finding container 27cb2f4a3741a009eb9e2c465f59fcc6be8619d7527c3cbe15d2584d7ff8bdf8: Status 404 returned error can't find the container with id 27cb2f4a3741a009eb9e2c465f59fcc6be8619d7527c3cbe15d2584d7ff8bdf8 Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.302858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerStarted","Data":"27cb2f4a3741a009eb9e2c465f59fcc6be8619d7527c3cbe15d2584d7ff8bdf8"} Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.520020 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.649671 4792 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.649733 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c6c59cfd-2add-4b4e-81c1-bacc77deae06" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.681661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br992\" (UniqueName: \"kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992\") pod \"5f563e18-52c5-4193-b9e9-70d632536974\" (UID: \"5f563e18-52c5-4193-b9e9-70d632536974\") " Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.686516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992" (OuterVolumeSpecName: "kube-api-access-br992") pod "5f563e18-52c5-4193-b9e9-70d632536974" (UID: "5f563e18-52c5-4193-b9e9-70d632536974"). InnerVolumeSpecName "kube-api-access-br992". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:04 crc kubenswrapper[4792]: I0318 15:50:04.784549 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br992\" (UniqueName: \"kubernetes.io/projected/5f563e18-52c5-4193-b9e9-70d632536974-kube-api-access-br992\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.312777 4792 generic.go:334] "Generic (PLEG): container finished" podID="b90f3673-7c3b-4466-8961-9b7363676c21" containerID="d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3" exitCode=0 Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.312843 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerDied","Data":"d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3"} Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.316073 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" event={"ID":"5f563e18-52c5-4193-b9e9-70d632536974","Type":"ContainerDied","Data":"94f1303258d06f9210b4918ed8e7054166261585f909cb5a67590934792cbaa5"} Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.316114 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94f1303258d06f9210b4918ed8e7054166261585f909cb5a67590934792cbaa5" Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.316135 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564150-gtk9r" Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.591664 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-nl5nl"] Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.597601 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564144-nl5nl"] Mar 18 15:50:05 crc kubenswrapper[4792]: I0318 15:50:05.866250 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b441fdb2-bae8-499e-b5a5-1872dcd1f704" path="/var/lib/kubelet/pods/b441fdb2-bae8-499e-b5a5-1872dcd1f704/volumes" Mar 18 15:50:06 crc kubenswrapper[4792]: I0318 15:50:06.332676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerStarted","Data":"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808"} Mar 18 15:50:07 crc kubenswrapper[4792]: I0318 15:50:07.341168 4792 generic.go:334] "Generic (PLEG): container finished" podID="b90f3673-7c3b-4466-8961-9b7363676c21" containerID="e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808" exitCode=0 Mar 18 15:50:07 crc kubenswrapper[4792]: I0318 15:50:07.341227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerDied","Data":"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808"} Mar 18 15:50:08 crc kubenswrapper[4792]: I0318 15:50:08.352302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerStarted","Data":"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967"} Mar 18 15:50:08 crc kubenswrapper[4792]: I0318 15:50:08.372447 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rcmm" podStartSLOduration=2.877966069 podStartE2EDuration="5.372428534s" podCreationTimestamp="2026-03-18 15:50:03 +0000 UTC" firstStartedPulling="2026-03-18 15:50:05.314902518 +0000 UTC m=+954.184231455" lastFinishedPulling="2026-03-18 15:50:07.809364983 +0000 UTC m=+956.678693920" observedRunningTime="2026-03-18 15:50:08.367215196 +0000 UTC m=+957.236544153" watchObservedRunningTime="2026-03-18 15:50:08.372428534 +0000 UTC m=+957.241757471" Mar 18 15:50:13 crc kubenswrapper[4792]: I0318 15:50:13.794094 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:13 crc kubenswrapper[4792]: I0318 15:50:13.794449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:13 crc kubenswrapper[4792]: I0318 15:50:13.845882 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:14 crc kubenswrapper[4792]: I0318 15:50:14.437216 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:14 crc kubenswrapper[4792]: I0318 15:50:14.475695 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:14 crc kubenswrapper[4792]: I0318 15:50:14.649118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.398589 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rcmm" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="registry-server" 
containerID="cri-o://44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967" gracePeriod=2 Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.761569 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.868360 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content\") pod \"b90f3673-7c3b-4466-8961-9b7363676c21\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.868521 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities\") pod \"b90f3673-7c3b-4466-8961-9b7363676c21\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.868585 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxc82\" (UniqueName: \"kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82\") pod \"b90f3673-7c3b-4466-8961-9b7363676c21\" (UID: \"b90f3673-7c3b-4466-8961-9b7363676c21\") " Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.869190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities" (OuterVolumeSpecName: "utilities") pod "b90f3673-7c3b-4466-8961-9b7363676c21" (UID: "b90f3673-7c3b-4466-8961-9b7363676c21"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.874497 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82" (OuterVolumeSpecName: "kube-api-access-rxc82") pod "b90f3673-7c3b-4466-8961-9b7363676c21" (UID: "b90f3673-7c3b-4466-8961-9b7363676c21"). InnerVolumeSpecName "kube-api-access-rxc82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.897199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b90f3673-7c3b-4466-8961-9b7363676c21" (UID: "b90f3673-7c3b-4466-8961-9b7363676c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.970157 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.970440 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxc82\" (UniqueName: \"kubernetes.io/projected/b90f3673-7c3b-4466-8961-9b7363676c21-kube-api-access-rxc82\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:16 crc kubenswrapper[4792]: I0318 15:50:16.970537 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90f3673-7c3b-4466-8961-9b7363676c21-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.407804 4792 generic.go:334] "Generic (PLEG): container finished" podID="b90f3673-7c3b-4466-8961-9b7363676c21" 
containerID="44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967" exitCode=0 Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.407842 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcmm" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.407850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerDied","Data":"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967"} Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.407961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcmm" event={"ID":"b90f3673-7c3b-4466-8961-9b7363676c21","Type":"ContainerDied","Data":"27cb2f4a3741a009eb9e2c465f59fcc6be8619d7527c3cbe15d2584d7ff8bdf8"} Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.407996 4792 scope.go:117] "RemoveContainer" containerID="44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.432135 4792 scope.go:117] "RemoveContainer" containerID="e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.438237 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.443947 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcmm"] Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.475140 4792 scope.go:117] "RemoveContainer" containerID="d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.490261 4792 scope.go:117] "RemoveContainer" containerID="44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967" Mar 18 
15:50:17 crc kubenswrapper[4792]: E0318 15:50:17.490742 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967\": container with ID starting with 44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967 not found: ID does not exist" containerID="44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.490784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967"} err="failed to get container status \"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967\": rpc error: code = NotFound desc = could not find container \"44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967\": container with ID starting with 44528f2c37932a302101157ead972d2a32f39b5bbf4a17c196847de62a2a1967 not found: ID does not exist" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.490832 4792 scope.go:117] "RemoveContainer" containerID="e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808" Mar 18 15:50:17 crc kubenswrapper[4792]: E0318 15:50:17.491261 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808\": container with ID starting with e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808 not found: ID does not exist" containerID="e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.491309 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808"} err="failed to get container status 
\"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808\": rpc error: code = NotFound desc = could not find container \"e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808\": container with ID starting with e366025eaa81b4050970e4e9a1f0b9232b948242a62c9062aeb4057ad33a9808 not found: ID does not exist" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.491353 4792 scope.go:117] "RemoveContainer" containerID="d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3" Mar 18 15:50:17 crc kubenswrapper[4792]: E0318 15:50:17.491606 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3\": container with ID starting with d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3 not found: ID does not exist" containerID="d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.491631 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3"} err="failed to get container status \"d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3\": rpc error: code = NotFound desc = could not find container \"d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3\": container with ID starting with d130f704b32e18ed6f50ac01ad69fdf67f0cd73da7f18dcb4f455eedf6434fa3 not found: ID does not exist" Mar 18 15:50:17 crc kubenswrapper[4792]: I0318 15:50:17.865030 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" path="/var/lib/kubelet/pods/b90f3673-7c3b-4466-8961-9b7363676c21/volumes" Mar 18 15:50:26 crc kubenswrapper[4792]: I0318 15:50:26.220183 4792 scope.go:117] "RemoveContainer" containerID="c21f868d9a176b0049aedfd41baf14e6ca159cc91748af1aebf646656ea0cab2" Mar 18 
15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.321590 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.322557 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.322628 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.323664 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.323732 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957" gracePeriod=600 Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.508669 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
containerID="f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957" exitCode=0 Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.508739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957"} Mar 18 15:50:30 crc kubenswrapper[4792]: I0318 15:50:30.508811 4792 scope.go:117] "RemoveContainer" containerID="ff8ccb857e3db00243a27bea97ff36a63ee591214974ddeca1d87a9822ae051d" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.293055 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-9vlnr"] Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.293815 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="extract-utilities" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.293828 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="extract-utilities" Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.293854 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f563e18-52c5-4193-b9e9-70d632536974" containerName="oc" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.293860 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f563e18-52c5-4193-b9e9-70d632536974" containerName="oc" Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.293866 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="extract-content" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.293874 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="extract-content" Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.293883 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="registry-server" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.293890 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="registry-server" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.294024 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f563e18-52c5-4193-b9e9-70d632536974" containerName="oc" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.294034 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90f3673-7c3b-4466-8961-9b7363676c21" containerName="registry-server" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.294520 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.298785 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.298797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-tngqp" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.298785 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.299893 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.300692 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.308122 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 
15:50:31.312748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.312781 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.312809 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.312907 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.312933 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313021 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313056 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313120 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313190 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwr8p\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313277 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.313301 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.332347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-9vlnr"] Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.381121 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-9vlnr"] Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.381658 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-lwr8p metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-9vlnr" podUID="2e8de5df-ac14-46b1-a734-103ea0a930c6" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415048 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415205 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415225 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint\") pod \"collector-9vlnr\" (UID: 
\"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwr8p\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415310 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.415334 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.416063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.416222 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.416921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.417844 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.417931 4792 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 18 15:50:31 crc kubenswrapper[4792]: E0318 15:50:31.417996 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics podName:2e8de5df-ac14-46b1-a734-103ea0a930c6 nodeName:}" failed. No retries permitted until 2026-03-18 15:50:31.917959299 +0000 UTC m=+980.787288326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics") pod "collector-9vlnr" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6") : secret "collector-metrics" not found Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.433395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.434000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.436226 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.436909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.447172 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwr8p\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p\") pod \"collector-9vlnr\" (UID: 
\"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.509488 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.511056 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.522414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2"} Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.522431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.526525 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.537941 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.618475 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.618597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.618665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfsmx\" (UniqueName: \"kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720084 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: 
\"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720178 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720213 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720260 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwr8p\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720342 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: 
\"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720410 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfsmx\" (UniqueName: \"kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.720801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content\") pod \"community-operators-q7c69\" (UID: 
\"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721354 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config" (OuterVolumeSpecName: "config") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721536 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir" (OuterVolumeSpecName: "datadir") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721749 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721912 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.721931 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.724130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p" (OuterVolumeSpecName: "kube-api-access-lwr8p") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "kube-api-access-lwr8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.724229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token" (OuterVolumeSpecName: "sa-token") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.725400 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token" (OuterVolumeSpecName: "collector-token") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.727128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.736157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp" (OuterVolumeSpecName: "tmp") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.746290 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfsmx\" (UniqueName: \"kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx\") pod \"community-operators-q7c69\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822714 4792 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822743 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822753 4792 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e8de5df-ac14-46b1-a734-103ea0a930c6-tmp\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822761 4792 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/2e8de5df-ac14-46b1-a734-103ea0a930c6-datadir\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822769 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822778 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwr8p\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-kube-api-access-lwr8p\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822793 4792 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822802 4792 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/2e8de5df-ac14-46b1-a734-103ea0a930c6-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822812 4792 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/2e8de5df-ac14-46b1-a734-103ea0a930c6-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.822823 4792 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.835537 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.924078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:31 crc kubenswrapper[4792]: I0318 15:50:31.928912 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") pod \"collector-9vlnr\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " pod="openshift-logging/collector-9vlnr" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.128055 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") pod \"2e8de5df-ac14-46b1-a734-103ea0a930c6\" (UID: \"2e8de5df-ac14-46b1-a734-103ea0a930c6\") " Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.132727 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics" (OuterVolumeSpecName: "metrics") pod "2e8de5df-ac14-46b1-a734-103ea0a930c6" (UID: "2e8de5df-ac14-46b1-a734-103ea0a930c6"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.230012 4792 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/2e8de5df-ac14-46b1-a734-103ea0a930c6-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.361999 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:32 crc kubenswrapper[4792]: W0318 15:50:32.369596 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d5b1b34_0c52_444d_8757_7575858a31e1.slice/crio-b9ca375027f83bbdd1c6cdf7e02bb1660ee368bfaf9c5e68d8ba12a079ec91d0 WatchSource:0}: Error finding container b9ca375027f83bbdd1c6cdf7e02bb1660ee368bfaf9c5e68d8ba12a079ec91d0: Status 404 returned error can't find the container with id b9ca375027f83bbdd1c6cdf7e02bb1660ee368bfaf9c5e68d8ba12a079ec91d0 Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.530407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerStarted","Data":"b9ca375027f83bbdd1c6cdf7e02bb1660ee368bfaf9c5e68d8ba12a079ec91d0"} Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.530426 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-9vlnr" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.587999 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-9vlnr"] Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.596628 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-9vlnr"] Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.609036 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-pmw4h"] Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.610180 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.613720 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.614125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.615040 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-tngqp" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.615062 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.615288 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.619859 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.620543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-pmw4h"] Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639251 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-metrics\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639347 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-sa-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-trusted-ca\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-entrypoint\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-tmp\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-syslog-receiver\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8bg\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-kube-api-access-pb8bg\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config-openshift-service-cacrt\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.639601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" 
(UniqueName: \"kubernetes.io/host-path/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-datadir\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-trusted-ca\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-entrypoint\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-tmp\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-syslog-receiver\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8bg\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-kube-api-access-pb8bg\") pod \"collector-pmw4h\" (UID: 
\"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config-openshift-service-cacrt\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-datadir\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740679 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-metrics\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.740725 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-sa-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.741166 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-datadir\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.741442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-entrypoint\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.741731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config-openshift-service-cacrt\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.741834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-trusted-ca\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.741960 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-config\") pod 
\"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.744023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-syslog-receiver\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.744051 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-tmp\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.748406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-metrics\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.748485 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-collector-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.758020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8bg\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-kube-api-access-pb8bg\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.760210 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae-sa-token\") pod \"collector-pmw4h\" (UID: \"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae\") " pod="openshift-logging/collector-pmw4h" Mar 18 15:50:32 crc kubenswrapper[4792]: I0318 15:50:32.994889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-pmw4h" Mar 18 15:50:33 crc kubenswrapper[4792]: I0318 15:50:33.419258 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-pmw4h"] Mar 18 15:50:33 crc kubenswrapper[4792]: I0318 15:50:33.537848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-pmw4h" event={"ID":"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae","Type":"ContainerStarted","Data":"2c4732f78b93d6c9ddfd0dd572daf6588529c40dfbbc9227f78c18e836b6daf2"} Mar 18 15:50:33 crc kubenswrapper[4792]: I0318 15:50:33.539247 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerID="740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b" exitCode=0 Mar 18 15:50:33 crc kubenswrapper[4792]: I0318 15:50:33.539289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerDied","Data":"740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b"} Mar 18 15:50:33 crc kubenswrapper[4792]: I0318 15:50:33.890301 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8de5df-ac14-46b1-a734-103ea0a930c6" path="/var/lib/kubelet/pods/2e8de5df-ac14-46b1-a734-103ea0a930c6/volumes" Mar 18 15:50:34 crc kubenswrapper[4792]: I0318 15:50:34.547463 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d5b1b34-0c52-444d-8757-7575858a31e1" 
containerID="1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b" exitCode=0 Mar 18 15:50:34 crc kubenswrapper[4792]: I0318 15:50:34.547542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerDied","Data":"1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b"} Mar 18 15:50:35 crc kubenswrapper[4792]: I0318 15:50:35.556510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerStarted","Data":"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e"} Mar 18 15:50:35 crc kubenswrapper[4792]: I0318 15:50:35.579009 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q7c69" podStartSLOduration=3.129650701 podStartE2EDuration="4.578987998s" podCreationTimestamp="2026-03-18 15:50:31 +0000 UTC" firstStartedPulling="2026-03-18 15:50:33.542317945 +0000 UTC m=+982.411646882" lastFinishedPulling="2026-03-18 15:50:34.991655242 +0000 UTC m=+983.860984179" observedRunningTime="2026-03-18 15:50:35.575180236 +0000 UTC m=+984.444509183" watchObservedRunningTime="2026-03-18 15:50:35.578987998 +0000 UTC m=+984.448316955" Mar 18 15:50:37 crc kubenswrapper[4792]: I0318 15:50:37.572323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-pmw4h" event={"ID":"9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae","Type":"ContainerStarted","Data":"a2c6d9fca0be8d18fc71f4b0779d025472407e85061e603efa584165ca51d314"} Mar 18 15:50:37 crc kubenswrapper[4792]: I0318 15:50:37.595934 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-pmw4h" podStartSLOduration=2.051960612 podStartE2EDuration="5.595908444s" podCreationTimestamp="2026-03-18 15:50:32 +0000 UTC" 
firstStartedPulling="2026-03-18 15:50:33.431721302 +0000 UTC m=+982.301050239" lastFinishedPulling="2026-03-18 15:50:36.975669144 +0000 UTC m=+985.844998071" observedRunningTime="2026-03-18 15:50:37.593530547 +0000 UTC m=+986.462859504" watchObservedRunningTime="2026-03-18 15:50:37.595908444 +0000 UTC m=+986.465237391" Mar 18 15:50:41 crc kubenswrapper[4792]: I0318 15:50:41.836369 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:41 crc kubenswrapper[4792]: I0318 15:50:41.836921 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:41 crc kubenswrapper[4792]: I0318 15:50:41.882915 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:42 crc kubenswrapper[4792]: I0318 15:50:42.646489 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:42 crc kubenswrapper[4792]: I0318 15:50:42.690188 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:44 crc kubenswrapper[4792]: I0318 15:50:44.622690 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q7c69" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="registry-server" containerID="cri-o://131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e" gracePeriod=2 Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.062816 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.259843 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content\") pod \"5d5b1b34-0c52-444d-8757-7575858a31e1\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.259945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities\") pod \"5d5b1b34-0c52-444d-8757-7575858a31e1\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.260018 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfsmx\" (UniqueName: \"kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx\") pod \"5d5b1b34-0c52-444d-8757-7575858a31e1\" (UID: \"5d5b1b34-0c52-444d-8757-7575858a31e1\") " Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.260840 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities" (OuterVolumeSpecName: "utilities") pod "5d5b1b34-0c52-444d-8757-7575858a31e1" (UID: "5d5b1b34-0c52-444d-8757-7575858a31e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.268618 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx" (OuterVolumeSpecName: "kube-api-access-cfsmx") pod "5d5b1b34-0c52-444d-8757-7575858a31e1" (UID: "5d5b1b34-0c52-444d-8757-7575858a31e1"). InnerVolumeSpecName "kube-api-access-cfsmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.310330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d5b1b34-0c52-444d-8757-7575858a31e1" (UID: "5d5b1b34-0c52-444d-8757-7575858a31e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.361756 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.361786 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5b1b34-0c52-444d-8757-7575858a31e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.361798 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfsmx\" (UniqueName: \"kubernetes.io/projected/5d5b1b34-0c52-444d-8757-7575858a31e1-kube-api-access-cfsmx\") on node \"crc\" DevicePath \"\"" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.632781 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerID="131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e" exitCode=0 Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.632835 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q7c69" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.632834 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerDied","Data":"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e"} Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.634066 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q7c69" event={"ID":"5d5b1b34-0c52-444d-8757-7575858a31e1","Type":"ContainerDied","Data":"b9ca375027f83bbdd1c6cdf7e02bb1660ee368bfaf9c5e68d8ba12a079ec91d0"} Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.634087 4792 scope.go:117] "RemoveContainer" containerID="131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.653577 4792 scope.go:117] "RemoveContainer" containerID="1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.664599 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.668839 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q7c69"] Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.675025 4792 scope.go:117] "RemoveContainer" containerID="740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.709461 4792 scope.go:117] "RemoveContainer" containerID="131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e" Mar 18 15:50:45 crc kubenswrapper[4792]: E0318 15:50:45.709944 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e\": container with ID starting with 131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e not found: ID does not exist" containerID="131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.710060 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e"} err="failed to get container status \"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e\": rpc error: code = NotFound desc = could not find container \"131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e\": container with ID starting with 131afde8cc183d05afd3b0159717e523c0ecf0e964b201310e2dffde0de5154e not found: ID does not exist" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.710087 4792 scope.go:117] "RemoveContainer" containerID="1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b" Mar 18 15:50:45 crc kubenswrapper[4792]: E0318 15:50:45.710440 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b\": container with ID starting with 1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b not found: ID does not exist" containerID="1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.710486 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b"} err="failed to get container status \"1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b\": rpc error: code = NotFound desc = could not find container \"1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b\": container with ID 
starting with 1a8959481369ca82b3ab28ed779f3f0e0dc3efada16ed50173c28f7ce88e616b not found: ID does not exist" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.710519 4792 scope.go:117] "RemoveContainer" containerID="740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b" Mar 18 15:50:45 crc kubenswrapper[4792]: E0318 15:50:45.710834 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b\": container with ID starting with 740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b not found: ID does not exist" containerID="740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.710866 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b"} err="failed to get container status \"740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b\": rpc error: code = NotFound desc = could not find container \"740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b\": container with ID starting with 740261af040a578f515ddc5078347f025b3f92c0e7056375a169b2d680f98a8b not found: ID does not exist" Mar 18 15:50:45 crc kubenswrapper[4792]: I0318 15:50:45.864721 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" path="/var/lib/kubelet/pods/5d5b1b34-0c52-444d-8757-7575858a31e1/volumes" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.522706 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:50:47 crc kubenswrapper[4792]: E0318 15:50:47.523064 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="extract-content" Mar 18 15:50:47 crc 
kubenswrapper[4792]: I0318 15:50:47.523080 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="extract-content" Mar 18 15:50:47 crc kubenswrapper[4792]: E0318 15:50:47.523091 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="registry-server" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.523098 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="registry-server" Mar 18 15:50:47 crc kubenswrapper[4792]: E0318 15:50:47.523107 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="extract-utilities" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.523115 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="extract-utilities" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.523285 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5b1b34-0c52-444d-8757-7575858a31e1" containerName="registry-server" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.524460 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.536763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.696667 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772kb\" (UniqueName: \"kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.696726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.696763 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.798277 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772kb\" (UniqueName: \"kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.798340 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.798378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.798904 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.799044 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.818195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772kb\" (UniqueName: \"kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb\") pod \"certified-operators-c7bd2\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:47 crc kubenswrapper[4792]: I0318 15:50:47.850462 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:48 crc kubenswrapper[4792]: I0318 15:50:48.410168 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:50:48 crc kubenswrapper[4792]: W0318 15:50:48.411605 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dca5f4c_7afc_4e6c_8ca8_46a0a54f1fe3.slice/crio-17c41a51cebec2014ace0970d6a8133d8f3533b1e5068b08d9efa49b8cc32fbf WatchSource:0}: Error finding container 17c41a51cebec2014ace0970d6a8133d8f3533b1e5068b08d9efa49b8cc32fbf: Status 404 returned error can't find the container with id 17c41a51cebec2014ace0970d6a8133d8f3533b1e5068b08d9efa49b8cc32fbf Mar 18 15:50:48 crc kubenswrapper[4792]: I0318 15:50:48.655886 4792 generic.go:334] "Generic (PLEG): container finished" podID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerID="63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043" exitCode=0 Mar 18 15:50:48 crc kubenswrapper[4792]: I0318 15:50:48.655926 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerDied","Data":"63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043"} Mar 18 15:50:48 crc kubenswrapper[4792]: I0318 15:50:48.655961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerStarted","Data":"17c41a51cebec2014ace0970d6a8133d8f3533b1e5068b08d9efa49b8cc32fbf"} Mar 18 15:50:49 crc kubenswrapper[4792]: I0318 15:50:49.664778 4792 generic.go:334] "Generic (PLEG): container finished" podID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerID="4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0" exitCode=0 Mar 18 15:50:49 crc kubenswrapper[4792]: I0318 
15:50:49.664849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerDied","Data":"4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0"} Mar 18 15:50:50 crc kubenswrapper[4792]: I0318 15:50:50.673744 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerStarted","Data":"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a"} Mar 18 15:50:50 crc kubenswrapper[4792]: I0318 15:50:50.695686 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c7bd2" podStartSLOduration=2.26674921 podStartE2EDuration="3.695663638s" podCreationTimestamp="2026-03-18 15:50:47 +0000 UTC" firstStartedPulling="2026-03-18 15:50:48.658527729 +0000 UTC m=+997.527856666" lastFinishedPulling="2026-03-18 15:50:50.087442157 +0000 UTC m=+998.956771094" observedRunningTime="2026-03-18 15:50:50.690164291 +0000 UTC m=+999.559493248" watchObservedRunningTime="2026-03-18 15:50:50.695663638 +0000 UTC m=+999.564992575" Mar 18 15:50:57 crc kubenswrapper[4792]: I0318 15:50:57.850548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:57 crc kubenswrapper[4792]: I0318 15:50:57.851205 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:57 crc kubenswrapper[4792]: I0318 15:50:57.896506 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:50:58 crc kubenswrapper[4792]: I0318 15:50:58.773664 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 
15:50:58 crc kubenswrapper[4792]: I0318 15:50:58.822318 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:51:00 crc kubenswrapper[4792]: I0318 15:51:00.742282 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c7bd2" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="registry-server" containerID="cri-o://ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a" gracePeriod=2 Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.164086 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.221244 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-772kb\" (UniqueName: \"kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb\") pod \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.227227 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb" (OuterVolumeSpecName: "kube-api-access-772kb") pod "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" (UID: "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3"). InnerVolumeSpecName "kube-api-access-772kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.322320 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities\") pod \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.322505 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content\") pod \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\" (UID: \"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3\") " Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.322985 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-772kb\" (UniqueName: \"kubernetes.io/projected/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-kube-api-access-772kb\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.323117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities" (OuterVolumeSpecName: "utilities") pod "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" (UID: "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.377212 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" (UID: "5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.425068 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.425110 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.750748 4792 generic.go:334] "Generic (PLEG): container finished" podID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerID="ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a" exitCode=0 Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.750794 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7bd2" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.750809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerDied","Data":"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a"} Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.751175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7bd2" event={"ID":"5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3","Type":"ContainerDied","Data":"17c41a51cebec2014ace0970d6a8133d8f3533b1e5068b08d9efa49b8cc32fbf"} Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.751217 4792 scope.go:117] "RemoveContainer" containerID="ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.779290 4792 scope.go:117] "RemoveContainer" 
containerID="4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.779528 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.783927 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c7bd2"] Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.807598 4792 scope.go:117] "RemoveContainer" containerID="63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.827413 4792 scope.go:117] "RemoveContainer" containerID="ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a" Mar 18 15:51:01 crc kubenswrapper[4792]: E0318 15:51:01.827787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a\": container with ID starting with ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a not found: ID does not exist" containerID="ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.827824 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a"} err="failed to get container status \"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a\": rpc error: code = NotFound desc = could not find container \"ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a\": container with ID starting with ebaa0da95aee81cfbb5dc9207ac007e230faa066aff8fa0814f68784d4fb114a not found: ID does not exist" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.827847 4792 scope.go:117] "RemoveContainer" 
containerID="4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0" Mar 18 15:51:01 crc kubenswrapper[4792]: E0318 15:51:01.828212 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0\": container with ID starting with 4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0 not found: ID does not exist" containerID="4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.828241 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0"} err="failed to get container status \"4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0\": rpc error: code = NotFound desc = could not find container \"4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0\": container with ID starting with 4039a18cb152cfaeb11361a5b2678d86b5ed6d12187a9ef8cb14772891777ab0 not found: ID does not exist" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.828258 4792 scope.go:117] "RemoveContainer" containerID="63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043" Mar 18 15:51:01 crc kubenswrapper[4792]: E0318 15:51:01.828504 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043\": container with ID starting with 63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043 not found: ID does not exist" containerID="63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.828536 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043"} err="failed to get container status \"63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043\": rpc error: code = NotFound desc = could not find container \"63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043\": container with ID starting with 63654b30cbfefc74650aa5f1ec518964f7f51f7e77857f1662b9ef17915e9043 not found: ID does not exist" Mar 18 15:51:01 crc kubenswrapper[4792]: I0318 15:51:01.867450 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" path="/var/lib/kubelet/pods/5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3/volumes" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.309169 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx"] Mar 18 15:51:10 crc kubenswrapper[4792]: E0318 15:51:10.310139 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="extract-utilities" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.310156 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="extract-utilities" Mar 18 15:51:10 crc kubenswrapper[4792]: E0318 15:51:10.310174 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="registry-server" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.310182 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="registry-server" Mar 18 15:51:10 crc kubenswrapper[4792]: E0318 15:51:10.310196 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="extract-content" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.310204 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="extract-content" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.310378 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dca5f4c-7afc-4e6c-8ca8-46a0a54f1fe3" containerName="registry-server" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.311580 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.318992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.326500 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx"] Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.464581 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.464780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78nkn\" (UniqueName: \"kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.464870 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.566441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.566554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78nkn\" (UniqueName: \"kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.566600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.567214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.567242 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.585363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78nkn\" (UniqueName: \"kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:10 crc kubenswrapper[4792]: I0318 15:51:10.631247 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:11 crc kubenswrapper[4792]: W0318 15:51:11.034448 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3864929_6390_4703_b97d_50451aae73fe.slice/crio-d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35 WatchSource:0}: Error finding container d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35: Status 404 returned error can't find the container with id d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35 Mar 18 15:51:11 crc kubenswrapper[4792]: I0318 15:51:11.035331 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx"] Mar 18 15:51:11 crc kubenswrapper[4792]: I0318 15:51:11.820240 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3864929-6390-4703-b97d-50451aae73fe" containerID="e0e7d6fa033f25a47be26a61a2c2de9fcb7554137f0ecc89a64178dd286cccfc" exitCode=0 Mar 18 15:51:11 crc kubenswrapper[4792]: I0318 15:51:11.820330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" event={"ID":"a3864929-6390-4703-b97d-50451aae73fe","Type":"ContainerDied","Data":"e0e7d6fa033f25a47be26a61a2c2de9fcb7554137f0ecc89a64178dd286cccfc"} Mar 18 15:51:11 crc kubenswrapper[4792]: I0318 15:51:11.820682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" event={"ID":"a3864929-6390-4703-b97d-50451aae73fe","Type":"ContainerStarted","Data":"d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35"} Mar 18 15:51:14 crc kubenswrapper[4792]: I0318 15:51:14.840228 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="a3864929-6390-4703-b97d-50451aae73fe" containerID="2b3cc2f9604de9e2668a681e8c051ee4b75642a4084ce9b1db1badb7de8d5308" exitCode=0 Mar 18 15:51:14 crc kubenswrapper[4792]: I0318 15:51:14.840330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" event={"ID":"a3864929-6390-4703-b97d-50451aae73fe","Type":"ContainerDied","Data":"2b3cc2f9604de9e2668a681e8c051ee4b75642a4084ce9b1db1badb7de8d5308"} Mar 18 15:51:15 crc kubenswrapper[4792]: I0318 15:51:15.850752 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3864929-6390-4703-b97d-50451aae73fe" containerID="6f24b8af295dd5d04db1e6be8886f149889651e084e0bf9d6061932913336b9f" exitCode=0 Mar 18 15:51:15 crc kubenswrapper[4792]: I0318 15:51:15.850811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" event={"ID":"a3864929-6390-4703-b97d-50451aae73fe","Type":"ContainerDied","Data":"6f24b8af295dd5d04db1e6be8886f149889651e084e0bf9d6061932913336b9f"} Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.125841 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.270010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util\") pod \"a3864929-6390-4703-b97d-50451aae73fe\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.270065 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78nkn\" (UniqueName: \"kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn\") pod \"a3864929-6390-4703-b97d-50451aae73fe\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.270100 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle\") pod \"a3864929-6390-4703-b97d-50451aae73fe\" (UID: \"a3864929-6390-4703-b97d-50451aae73fe\") " Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.270675 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle" (OuterVolumeSpecName: "bundle") pod "a3864929-6390-4703-b97d-50451aae73fe" (UID: "a3864929-6390-4703-b97d-50451aae73fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.276565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn" (OuterVolumeSpecName: "kube-api-access-78nkn") pod "a3864929-6390-4703-b97d-50451aae73fe" (UID: "a3864929-6390-4703-b97d-50451aae73fe"). InnerVolumeSpecName "kube-api-access-78nkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.280883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util" (OuterVolumeSpecName: "util") pod "a3864929-6390-4703-b97d-50451aae73fe" (UID: "a3864929-6390-4703-b97d-50451aae73fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.372252 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.372286 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78nkn\" (UniqueName: \"kubernetes.io/projected/a3864929-6390-4703-b97d-50451aae73fe-kube-api-access-78nkn\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.372302 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3864929-6390-4703-b97d-50451aae73fe-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.867806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" event={"ID":"a3864929-6390-4703-b97d-50451aae73fe","Type":"ContainerDied","Data":"d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35"} Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.867860 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cb018956cc09ef817c9ce4fc76d92034d5615cbeb03639d16dbe3c700b4b35" Mar 18 15:51:17 crc kubenswrapper[4792]: I0318 15:51:17.867956 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.038997 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-mskk6"] Mar 18 15:51:22 crc kubenswrapper[4792]: E0318 15:51:22.040176 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="pull" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.040193 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="pull" Mar 18 15:51:22 crc kubenswrapper[4792]: E0318 15:51:22.040223 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="extract" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.040233 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="extract" Mar 18 15:51:22 crc kubenswrapper[4792]: E0318 15:51:22.040245 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="util" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.040254 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="util" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.040407 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3864929-6390-4703-b97d-50451aae73fe" containerName="extract" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.041041 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.043609 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.043764 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.046954 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-n9w7j" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.048128 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cbn\" (UniqueName: \"kubernetes.io/projected/465e5e04-b3e9-4b8c-98dc-abd9f050de38-kube-api-access-t4cbn\") pod \"nmstate-operator-796d4cfff4-mskk6\" (UID: \"465e5e04-b3e9-4b8c-98dc-abd9f050de38\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.057914 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-mskk6"] Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.149962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cbn\" (UniqueName: \"kubernetes.io/projected/465e5e04-b3e9-4b8c-98dc-abd9f050de38-kube-api-access-t4cbn\") pod \"nmstate-operator-796d4cfff4-mskk6\" (UID: \"465e5e04-b3e9-4b8c-98dc-abd9f050de38\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.168136 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cbn\" (UniqueName: \"kubernetes.io/projected/465e5e04-b3e9-4b8c-98dc-abd9f050de38-kube-api-access-t4cbn\") pod \"nmstate-operator-796d4cfff4-mskk6\" (UID: 
\"465e5e04-b3e9-4b8c-98dc-abd9f050de38\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.366493 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" Mar 18 15:51:22 crc kubenswrapper[4792]: I0318 15:51:22.942369 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-mskk6"] Mar 18 15:51:23 crc kubenswrapper[4792]: I0318 15:51:23.907541 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" event={"ID":"465e5e04-b3e9-4b8c-98dc-abd9f050de38","Type":"ContainerStarted","Data":"a756504679f9da4f2479531ae0843e9709a835b2e3f5668d62154fb8318a128d"} Mar 18 15:51:25 crc kubenswrapper[4792]: I0318 15:51:25.925595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" event={"ID":"465e5e04-b3e9-4b8c-98dc-abd9f050de38","Type":"ContainerStarted","Data":"07660f7e44b6765e5e0f0b6edc8e07eb63bece09e7535410c7ccf96520f08281"} Mar 18 15:51:25 crc kubenswrapper[4792]: I0318 15:51:25.947444 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-mskk6" podStartSLOduration=1.165471944 podStartE2EDuration="3.947424458s" podCreationTimestamp="2026-03-18 15:51:22 +0000 UTC" firstStartedPulling="2026-03-18 15:51:22.947655957 +0000 UTC m=+1031.816984894" lastFinishedPulling="2026-03-18 15:51:25.729608471 +0000 UTC m=+1034.598937408" observedRunningTime="2026-03-18 15:51:25.944063329 +0000 UTC m=+1034.813392266" watchObservedRunningTime="2026-03-18 15:51:25.947424458 +0000 UTC m=+1034.816753395" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.244790 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 
15:51:31.246763 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.260834 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rc7bv" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.273800 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.311549 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rlx82"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.313318 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.327713 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.335787 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s4q9g"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.344555 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.385014 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rlx82"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.387358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dws9p\" (UniqueName: \"kubernetes.io/projected/ae143031-3c99-45c6-a0bf-6e8b8a3c1d14-kube-api-access-dws9p\") pod \"nmstate-metrics-9b8c8685d-kmt7t\" (UID: \"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.488657 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.488880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmrlk\" (UniqueName: \"kubernetes.io/projected/7af72a3d-98a7-4a83-affa-3d382184fc59-kube-api-access-rmrlk\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.488993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-ovs-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.489090 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-dbus-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.489189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8gv\" (UniqueName: \"kubernetes.io/projected/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-kube-api-access-sw8gv\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.489304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-nmstate-lock\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.489379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dws9p\" (UniqueName: \"kubernetes.io/projected/ae143031-3c99-45c6-a0bf-6e8b8a3c1d14-kube-api-access-dws9p\") pod \"nmstate-metrics-9b8c8685d-kmt7t\" (UID: \"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.493555 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.494432 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.499667 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.501056 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.502233 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kcwb2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.507952 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.529226 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dws9p\" (UniqueName: \"kubernetes.io/projected/ae143031-3c99-45c6-a0bf-6e8b8a3c1d14-kube-api-access-dws9p\") pod \"nmstate-metrics-9b8c8685d-kmt7t\" (UID: \"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.562407 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591089 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-dbus-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8gv\" (UniqueName: \"kubernetes.io/projected/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-kube-api-access-sw8gv\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-nmstate-lock\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrlk\" (UniqueName: \"kubernetes.io/projected/7af72a3d-98a7-4a83-affa-3d382184fc59-kube-api-access-rmrlk\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-ovs-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: E0318 15:51:31.591735 4792 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 15:51:31 crc kubenswrapper[4792]: E0318 15:51:31.592164 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair podName:7af72a3d-98a7-4a83-affa-3d382184fc59 nodeName:}" failed. No retries permitted until 2026-03-18 15:51:32.092142026 +0000 UTC m=+1040.961470963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair") pod "nmstate-webhook-5f558f5558-rlx82" (UID: "7af72a3d-98a7-4a83-affa-3d382184fc59") : secret "openshift-nmstate-webhook" not found Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-dbus-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-ovs-socket\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.591555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-nmstate-lock\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.610290 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmrlk\" (UniqueName: \"kubernetes.io/projected/7af72a3d-98a7-4a83-affa-3d382184fc59-kube-api-access-rmrlk\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.611419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8gv\" (UniqueName: 
\"kubernetes.io/projected/c7af6f36-f51d-4d49-85d2-5d4081ad57a6-kube-api-access-sw8gv\") pod \"nmstate-handler-s4q9g\" (UID: \"c7af6f36-f51d-4d49-85d2-5d4081ad57a6\") " pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.707216 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6x77\" (UniqueName: \"kubernetes.io/projected/910601c0-aac3-4fe1-9735-90b6329e26c3-kube-api-access-k6x77\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.709881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.712217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/910601c0-aac3-4fe1-9735-90b6329e26c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.712432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/910601c0-aac3-4fe1-9735-90b6329e26c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.734083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.735428 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.761946 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.813061 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6x77\" (UniqueName: \"kubernetes.io/projected/910601c0-aac3-4fe1-9735-90b6329e26c3-kube-api-access-k6x77\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814516 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/910601c0-aac3-4fe1-9735-90b6329e26c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzvd\" (UniqueName: \"kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" 
Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.814701 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/910601c0-aac3-4fe1-9735-90b6329e26c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.816895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/910601c0-aac3-4fe1-9735-90b6329e26c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.836837 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/910601c0-aac3-4fe1-9735-90b6329e26c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.837829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6x77\" (UniqueName: \"kubernetes.io/projected/910601c0-aac3-4fe1-9735-90b6329e26c3-kube-api-access-k6x77\") pod \"nmstate-console-plugin-86f58fcf4-g4ld2\" (UID: \"910601c0-aac3-4fe1-9735-90b6329e26c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916561 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " 
pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916772 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzvd\" (UniqueName: \"kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916868 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.916924 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " 
pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.918011 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.918231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.918410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.918929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.923707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc 
kubenswrapper[4792]: I0318 15:51:31.937684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.938239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.939329 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzvd\" (UniqueName: \"kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd\") pod \"console-85f4b56fb6-xnb5g\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:31 crc kubenswrapper[4792]: I0318 15:51:31.966487 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4q9g" event={"ID":"c7af6f36-f51d-4d49-85d2-5d4081ad57a6","Type":"ContainerStarted","Data":"6a34edacb784e80ec8f213e79404b8aec546ee65f1a290054d2219be69f7511b"} Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.101316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.109468 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.121517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.126724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af72a3d-98a7-4a83-affa-3d382184fc59-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rlx82\" (UID: \"7af72a3d-98a7-4a83-affa-3d382184fc59\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.137565 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t"] Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.284978 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.549547 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:51:32 crc kubenswrapper[4792]: W0318 15:51:32.555222 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bcba05_574e_478e_880d_0f46b4ed2052.slice/crio-2dccd64441942b3b6ab4baa8b27e5b93299c3d7e4a9cdda548e1f5b02f5e8346 WatchSource:0}: Error finding container 2dccd64441942b3b6ab4baa8b27e5b93299c3d7e4a9cdda548e1f5b02f5e8346: Status 404 returned error can't find the container with id 2dccd64441942b3b6ab4baa8b27e5b93299c3d7e4a9cdda548e1f5b02f5e8346 Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.603869 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2"] Mar 18 15:51:32 crc kubenswrapper[4792]: W0318 15:51:32.608520 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910601c0_aac3_4fe1_9735_90b6329e26c3.slice/crio-66383e2f331b24d692941c5b1b743acfce1263b9e40b3b39c5582d96c9778f97 WatchSource:0}: Error finding container 66383e2f331b24d692941c5b1b743acfce1263b9e40b3b39c5582d96c9778f97: Status 404 returned error can't find the container with id 66383e2f331b24d692941c5b1b743acfce1263b9e40b3b39c5582d96c9778f97 Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.697706 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rlx82"] Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.975920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f4b56fb6-xnb5g" 
event={"ID":"b9bcba05-574e-478e-880d-0f46b4ed2052","Type":"ContainerStarted","Data":"83ed938be43060ade26a09a9ab31173a73b86a81c312731f1f25b6a6781815d7"} Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.976190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f4b56fb6-xnb5g" event={"ID":"b9bcba05-574e-478e-880d-0f46b4ed2052","Type":"ContainerStarted","Data":"2dccd64441942b3b6ab4baa8b27e5b93299c3d7e4a9cdda548e1f5b02f5e8346"} Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.976860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" event={"ID":"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14","Type":"ContainerStarted","Data":"b87ceaf030084163795954c318c96eded604335ac83e9f9247720f9f0ec3b697"} Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.978739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" event={"ID":"7af72a3d-98a7-4a83-affa-3d382184fc59","Type":"ContainerStarted","Data":"b4612154356fb7ec0223817ca59e38edfafbf42f9619f5f49b406ee2c202bae1"} Mar 18 15:51:32 crc kubenswrapper[4792]: I0318 15:51:32.979866 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" event={"ID":"910601c0-aac3-4fe1-9735-90b6329e26c3","Type":"ContainerStarted","Data":"66383e2f331b24d692941c5b1b743acfce1263b9e40b3b39c5582d96c9778f97"} Mar 18 15:51:33 crc kubenswrapper[4792]: I0318 15:51:33.005355 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85f4b56fb6-xnb5g" podStartSLOduration=2.005331172 podStartE2EDuration="2.005331172s" podCreationTimestamp="2026-03-18 15:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:51:32.993197304 +0000 UTC m=+1041.862526251" watchObservedRunningTime="2026-03-18 
15:51:33.005331172 +0000 UTC m=+1041.874660109" Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.005710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" event={"ID":"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14","Type":"ContainerStarted","Data":"ecc18edb264b277b2c72912982ad1efd7acf32dcc6ff682971af37d86eaa77a2"} Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.008429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" event={"ID":"7af72a3d-98a7-4a83-affa-3d382184fc59","Type":"ContainerStarted","Data":"e959f5f01bc76d7b0983909441e973631e6b09c932cebdd4ed281797c4c6c122"} Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.008476 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.010638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4q9g" event={"ID":"c7af6f36-f51d-4d49-85d2-5d4081ad57a6","Type":"ContainerStarted","Data":"9d919263a95b10e402812f16f5ec12cd5ca9fe8b3b90cc17c1c4f5ba91d94059"} Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.010757 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:36 crc kubenswrapper[4792]: I0318 15:51:36.027609 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" podStartSLOduration=2.527972756 podStartE2EDuration="5.027588326s" podCreationTimestamp="2026-03-18 15:51:31 +0000 UTC" firstStartedPulling="2026-03-18 15:51:32.698736206 +0000 UTC m=+1041.568065143" lastFinishedPulling="2026-03-18 15:51:35.198351776 +0000 UTC m=+1044.067680713" observedRunningTime="2026-03-18 15:51:36.023609929 +0000 UTC m=+1044.892938876" watchObservedRunningTime="2026-03-18 15:51:36.027588326 +0000 
UTC m=+1044.896917263" Mar 18 15:51:38 crc kubenswrapper[4792]: I0318 15:51:38.029496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" event={"ID":"910601c0-aac3-4fe1-9735-90b6329e26c3","Type":"ContainerStarted","Data":"dfa575f41d482bd3e699a1e1bfa0094fceeb9c1bc452fbb3cd132b935c0f84b1"} Mar 18 15:51:38 crc kubenswrapper[4792]: I0318 15:51:38.062539 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s4q9g" podStartSLOduration=3.642823349 podStartE2EDuration="7.062515209s" podCreationTimestamp="2026-03-18 15:51:31 +0000 UTC" firstStartedPulling="2026-03-18 15:51:31.768791371 +0000 UTC m=+1040.638120308" lastFinishedPulling="2026-03-18 15:51:35.188483231 +0000 UTC m=+1044.057812168" observedRunningTime="2026-03-18 15:51:36.04640865 +0000 UTC m=+1044.915737587" watchObservedRunningTime="2026-03-18 15:51:38.062515209 +0000 UTC m=+1046.931844146" Mar 18 15:51:38 crc kubenswrapper[4792]: I0318 15:51:38.063215 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g4ld2" podStartSLOduration=2.900019027 podStartE2EDuration="7.063207881s" podCreationTimestamp="2026-03-18 15:51:31 +0000 UTC" firstStartedPulling="2026-03-18 15:51:32.611200624 +0000 UTC m=+1041.480529551" lastFinishedPulling="2026-03-18 15:51:36.774389458 +0000 UTC m=+1045.643718405" observedRunningTime="2026-03-18 15:51:38.051436024 +0000 UTC m=+1046.920764981" watchObservedRunningTime="2026-03-18 15:51:38.063207881 +0000 UTC m=+1046.932536818" Mar 18 15:51:39 crc kubenswrapper[4792]: I0318 15:51:39.037184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" event={"ID":"ae143031-3c99-45c6-a0bf-6e8b8a3c1d14","Type":"ContainerStarted","Data":"2705add8d78ccf49995d36c2e9ea6566be575837c87933fd12178a1b8240187f"} Mar 18 15:51:39 crc kubenswrapper[4792]: I0318 
15:51:39.060417 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-kmt7t" podStartSLOduration=2.068161755 podStartE2EDuration="8.060396819s" podCreationTimestamp="2026-03-18 15:51:31 +0000 UTC" firstStartedPulling="2026-03-18 15:51:32.140988109 +0000 UTC m=+1041.010317056" lastFinishedPulling="2026-03-18 15:51:38.133223183 +0000 UTC m=+1047.002552120" observedRunningTime="2026-03-18 15:51:39.052436544 +0000 UTC m=+1047.921765581" watchObservedRunningTime="2026-03-18 15:51:39.060396819 +0000 UTC m=+1047.929725756" Mar 18 15:51:41 crc kubenswrapper[4792]: I0318 15:51:41.733472 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s4q9g" Mar 18 15:51:42 crc kubenswrapper[4792]: I0318 15:51:42.103651 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:42 crc kubenswrapper[4792]: I0318 15:51:42.103691 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:42 crc kubenswrapper[4792]: I0318 15:51:42.108527 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:43 crc kubenswrapper[4792]: I0318 15:51:43.071996 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:51:43 crc kubenswrapper[4792]: I0318 15:51:43.129317 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:51:52 crc kubenswrapper[4792]: I0318 15:51:52.292143 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.137899 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564152-9mw9z"] Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.140172 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.144703 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.144992 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.145930 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.150793 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-9mw9z"] Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.206912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrwc\" (UniqueName: \"kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc\") pod \"auto-csr-approver-29564152-9mw9z\" (UID: \"3cab244a-8839-4173-930c-b5fed3a6fde1\") " pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.307922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrwc\" (UniqueName: \"kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc\") pod \"auto-csr-approver-29564152-9mw9z\" (UID: \"3cab244a-8839-4173-930c-b5fed3a6fde1\") " pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.327474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrwc\" (UniqueName: 
\"kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc\") pod \"auto-csr-approver-29564152-9mw9z\" (UID: \"3cab244a-8839-4173-930c-b5fed3a6fde1\") " pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.480603 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:00 crc kubenswrapper[4792]: I0318 15:52:00.709070 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-9mw9z"] Mar 18 15:52:01 crc kubenswrapper[4792]: I0318 15:52:01.232133 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" event={"ID":"3cab244a-8839-4173-930c-b5fed3a6fde1","Type":"ContainerStarted","Data":"d3f2d2bfe13af820e8492fc31f95b1e4ffdc8f492815528509a67e638fd1ed58"} Mar 18 15:52:02 crc kubenswrapper[4792]: I0318 15:52:02.250076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" event={"ID":"3cab244a-8839-4173-930c-b5fed3a6fde1","Type":"ContainerStarted","Data":"1766b4dff7fc2fd9360828e962dc090b101b583f2a29d299479f530191326bd7"} Mar 18 15:52:02 crc kubenswrapper[4792]: I0318 15:52:02.270212 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" podStartSLOduration=1.359675267 podStartE2EDuration="2.270193949s" podCreationTimestamp="2026-03-18 15:52:00 +0000 UTC" firstStartedPulling="2026-03-18 15:52:00.717820967 +0000 UTC m=+1069.587149894" lastFinishedPulling="2026-03-18 15:52:01.628339639 +0000 UTC m=+1070.497668576" observedRunningTime="2026-03-18 15:52:02.26272918 +0000 UTC m=+1071.132058127" watchObservedRunningTime="2026-03-18 15:52:02.270193949 +0000 UTC m=+1071.139522886" Mar 18 15:52:03 crc kubenswrapper[4792]: I0318 15:52:03.258608 4792 generic.go:334] "Generic (PLEG): container 
finished" podID="3cab244a-8839-4173-930c-b5fed3a6fde1" containerID="1766b4dff7fc2fd9360828e962dc090b101b583f2a29d299479f530191326bd7" exitCode=0 Mar 18 15:52:03 crc kubenswrapper[4792]: I0318 15:52:03.258708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" event={"ID":"3cab244a-8839-4173-930c-b5fed3a6fde1","Type":"ContainerDied","Data":"1766b4dff7fc2fd9360828e962dc090b101b583f2a29d299479f530191326bd7"} Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.702341 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.736504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqrwc\" (UniqueName: \"kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc\") pod \"3cab244a-8839-4173-930c-b5fed3a6fde1\" (UID: \"3cab244a-8839-4173-930c-b5fed3a6fde1\") " Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.763207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc" (OuterVolumeSpecName: "kube-api-access-hqrwc") pod "3cab244a-8839-4173-930c-b5fed3a6fde1" (UID: "3cab244a-8839-4173-930c-b5fed3a6fde1"). InnerVolumeSpecName "kube-api-access-hqrwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.838206 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqrwc\" (UniqueName: \"kubernetes.io/projected/3cab244a-8839-4173-930c-b5fed3a6fde1-kube-api-access-hqrwc\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.947360 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-z6l4d"] Mar 18 15:52:04 crc kubenswrapper[4792]: I0318 15:52:04.952508 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564146-z6l4d"] Mar 18 15:52:05 crc kubenswrapper[4792]: I0318 15:52:05.277072 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" event={"ID":"3cab244a-8839-4173-930c-b5fed3a6fde1","Type":"ContainerDied","Data":"d3f2d2bfe13af820e8492fc31f95b1e4ffdc8f492815528509a67e638fd1ed58"} Mar 18 15:52:05 crc kubenswrapper[4792]: I0318 15:52:05.277116 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3f2d2bfe13af820e8492fc31f95b1e4ffdc8f492815528509a67e638fd1ed58" Mar 18 15:52:05 crc kubenswrapper[4792]: I0318 15:52:05.277131 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564152-9mw9z" Mar 18 15:52:05 crc kubenswrapper[4792]: I0318 15:52:05.869498 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a82a2eb-cb6d-4ccd-b7f0-610978943926" path="/var/lib/kubelet/pods/9a82a2eb-cb6d-4ccd-b7f0-610978943926/volumes" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.180877 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-54d8f469b-rjgv6" podUID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" containerName="console" containerID="cri-o://145dbc6aba298088d9997334d38be5dbc21489a12a9795bb914d5bdc94322564" gracePeriod=15 Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.326771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54d8f469b-rjgv6_149e0c91-96bc-4c5d-9f3f-e2558cc912cb/console/0.log" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.326848 4792 generic.go:334] "Generic (PLEG): container finished" podID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" containerID="145dbc6aba298088d9997334d38be5dbc21489a12a9795bb914d5bdc94322564" exitCode=2 Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.326893 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54d8f469b-rjgv6" event={"ID":"149e0c91-96bc-4c5d-9f3f-e2558cc912cb","Type":"ContainerDied","Data":"145dbc6aba298088d9997334d38be5dbc21489a12a9795bb914d5bdc94322564"} Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.603279 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54d8f469b-rjgv6_149e0c91-96bc-4c5d-9f3f-e2558cc912cb/console/0.log" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.603613 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707599 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707693 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707734 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j97w4\" (UniqueName: \"kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707810 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.707875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config\") pod \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\" (UID: \"149e0c91-96bc-4c5d-9f3f-e2558cc912cb\") " Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.708408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.708458 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.708512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config" (OuterVolumeSpecName: "console-config") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.708820 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca" (OuterVolumeSpecName: "service-ca") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.716810 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.734156 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4" (OuterVolumeSpecName: "kube-api-access-j97w4") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "kube-api-access-j97w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.738491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "149e0c91-96bc-4c5d-9f3f-e2558cc912cb" (UID: "149e0c91-96bc-4c5d-9f3f-e2558cc912cb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810030 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810082 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810096 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j97w4\" (UniqueName: \"kubernetes.io/projected/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-kube-api-access-j97w4\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810111 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810123 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810134 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:08 crc kubenswrapper[4792]: I0318 15:52:08.810144 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149e0c91-96bc-4c5d-9f3f-e2558cc912cb-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:09 crc 
kubenswrapper[4792]: I0318 15:52:09.336012 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54d8f469b-rjgv6_149e0c91-96bc-4c5d-9f3f-e2558cc912cb/console/0.log" Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.336068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54d8f469b-rjgv6" event={"ID":"149e0c91-96bc-4c5d-9f3f-e2558cc912cb","Type":"ContainerDied","Data":"f02c5d09e58e2e8530d56bb513d73369d42f0bac9176843bc75ab48e27875341"} Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.336111 4792 scope.go:117] "RemoveContainer" containerID="145dbc6aba298088d9997334d38be5dbc21489a12a9795bb914d5bdc94322564" Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.336181 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54d8f469b-rjgv6" Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.375952 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.384425 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54d8f469b-rjgv6"] Mar 18 15:52:09 crc kubenswrapper[4792]: I0318 15:52:09.863715 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" path="/var/lib/kubelet/pods/149e0c91-96bc-4c5d-9f3f-e2558cc912cb/volumes" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.949197 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k"] Mar 18 15:52:10 crc kubenswrapper[4792]: E0318 15:52:10.949680 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" containerName="console" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.949693 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" containerName="console" Mar 18 15:52:10 crc kubenswrapper[4792]: E0318 15:52:10.949704 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cab244a-8839-4173-930c-b5fed3a6fde1" containerName="oc" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.949712 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cab244a-8839-4173-930c-b5fed3a6fde1" containerName="oc" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.949867 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="149e0c91-96bc-4c5d-9f3f-e2558cc912cb" containerName="console" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.949882 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cab244a-8839-4173-930c-b5fed3a6fde1" containerName="oc" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.950864 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.956564 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 15:52:10 crc kubenswrapper[4792]: I0318 15:52:10.959092 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k"] Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.042337 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.042400 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6gnf\" (UniqueName: \"kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.042447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.144408 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.144593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6gnf\" (UniqueName: \"kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.144698 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.145575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.145763 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.166651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6gnf\" (UniqueName: \"kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.270359 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:11 crc kubenswrapper[4792]: I0318 15:52:11.668173 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k"] Mar 18 15:52:12 crc kubenswrapper[4792]: I0318 15:52:12.370883 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerID="e1de5a0d95d3c4dde342f711a432ef89f4bce76eac47d652fa617e1d5a61260e" exitCode=0 Mar 18 15:52:12 crc kubenswrapper[4792]: I0318 15:52:12.371022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" event={"ID":"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3","Type":"ContainerDied","Data":"e1de5a0d95d3c4dde342f711a432ef89f4bce76eac47d652fa617e1d5a61260e"} Mar 18 15:52:12 crc kubenswrapper[4792]: I0318 15:52:12.371258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" event={"ID":"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3","Type":"ContainerStarted","Data":"0ca4dde3de5bf0f9195f2dc68eba0c774526b4b6856c342243379e518cee7fe5"} Mar 18 15:52:14 crc kubenswrapper[4792]: I0318 15:52:14.391308 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerID="fb9ca16e02c54d512f6066024a7961ec266c543c2953ceec6670e535c42a1870" exitCode=0 Mar 18 15:52:14 crc kubenswrapper[4792]: I0318 15:52:14.391418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" event={"ID":"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3","Type":"ContainerDied","Data":"fb9ca16e02c54d512f6066024a7961ec266c543c2953ceec6670e535c42a1870"} Mar 18 15:52:15 crc kubenswrapper[4792]: I0318 15:52:15.405814 4792 
generic.go:334] "Generic (PLEG): container finished" podID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerID="3a2eb618000da6a18ea69c9950a858d0bc66227967e9555ac33da455b00895d6" exitCode=0 Mar 18 15:52:15 crc kubenswrapper[4792]: I0318 15:52:15.405864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" event={"ID":"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3","Type":"ContainerDied","Data":"3a2eb618000da6a18ea69c9950a858d0bc66227967e9555ac33da455b00895d6"} Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.743486 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.845433 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6gnf\" (UniqueName: \"kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf\") pod \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.846178 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle\") pod \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.846240 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util\") pod \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\" (UID: \"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3\") " Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.848090 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle" (OuterVolumeSpecName: "bundle") pod "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" (UID: "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.854074 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf" (OuterVolumeSpecName: "kube-api-access-l6gnf") pod "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" (UID: "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3"). InnerVolumeSpecName "kube-api-access-l6gnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.860283 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util" (OuterVolumeSpecName: "util") pod "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" (UID: "e6dbd030-bd91-4ac2-9140-3bc0bc5214b3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.948471 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6gnf\" (UniqueName: \"kubernetes.io/projected/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-kube-api-access-l6gnf\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.948524 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:16 crc kubenswrapper[4792]: I0318 15:52:16.948535 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6dbd030-bd91-4ac2-9140-3bc0bc5214b3-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:52:17 crc kubenswrapper[4792]: I0318 15:52:17.422962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" event={"ID":"e6dbd030-bd91-4ac2-9140-3bc0bc5214b3","Type":"ContainerDied","Data":"0ca4dde3de5bf0f9195f2dc68eba0c774526b4b6856c342243379e518cee7fe5"} Mar 18 15:52:17 crc kubenswrapper[4792]: I0318 15:52:17.423286 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca4dde3de5bf0f9195f2dc68eba0c774526b4b6856c342243379e518cee7fe5" Mar 18 15:52:17 crc kubenswrapper[4792]: I0318 15:52:17.423030 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.422270 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z"] Mar 18 15:52:25 crc kubenswrapper[4792]: E0318 15:52:25.423127 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="pull" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.423142 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="pull" Mar 18 15:52:25 crc kubenswrapper[4792]: E0318 15:52:25.423158 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="util" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.423164 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="util" Mar 18 15:52:25 crc kubenswrapper[4792]: E0318 15:52:25.423180 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="extract" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.423188 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="extract" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.423367 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6dbd030-bd91-4ac2-9140-3bc0bc5214b3" containerName="extract" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.423888 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.428927 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.428927 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.429616 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fdsbc" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.429645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.429981 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.455346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z"] Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.476015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-apiservice-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.476082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-webhook-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: 
\"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.476114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzggx\" (UniqueName: \"kubernetes.io/projected/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-kube-api-access-zzggx\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.578216 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-apiservice-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.578285 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-webhook-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.578308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzggx\" (UniqueName: \"kubernetes.io/projected/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-kube-api-access-zzggx\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.586169 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-webhook-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.591728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-apiservice-cert\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.594823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzggx\" (UniqueName: \"kubernetes.io/projected/ae1d2de8-ac87-4f0e-97c5-3bbb88279055-kube-api-access-zzggx\") pod \"metallb-operator-controller-manager-7cfbbd978d-5f96z\" (UID: \"ae1d2de8-ac87-4f0e-97c5-3bbb88279055\") " pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.747956 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.748892 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85"] Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.749817 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.751779 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bqr27" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.752111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.754668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.762865 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85"] Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.781693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-apiservice-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.781775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jkd\" (UniqueName: \"kubernetes.io/projected/8bdbd945-a92a-471b-8a37-c999fe503caa-kube-api-access-l9jkd\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.781860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-webhook-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.882716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-webhook-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.882773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-apiservice-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.882832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jkd\" (UniqueName: \"kubernetes.io/projected/8bdbd945-a92a-471b-8a37-c999fe503caa-kube-api-access-l9jkd\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.889785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-webhook-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc 
kubenswrapper[4792]: I0318 15:52:25.898603 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bdbd945-a92a-471b-8a37-c999fe503caa-apiservice-cert\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:25 crc kubenswrapper[4792]: I0318 15:52:25.907542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jkd\" (UniqueName: \"kubernetes.io/projected/8bdbd945-a92a-471b-8a37-c999fe503caa-kube-api-access-l9jkd\") pod \"metallb-operator-webhook-server-78dddf6df5-kxk85\" (UID: \"8bdbd945-a92a-471b-8a37-c999fe503caa\") " pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:26 crc kubenswrapper[4792]: I0318 15:52:26.133472 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" Mar 18 15:52:26 crc kubenswrapper[4792]: I0318 15:52:26.295715 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z"] Mar 18 15:52:26 crc kubenswrapper[4792]: I0318 15:52:26.308554 4792 scope.go:117] "RemoveContainer" containerID="75baa09b269232bc32b552f31ef7a606716742b9b39d7045a77c8c68788bd6da" Mar 18 15:52:26 crc kubenswrapper[4792]: W0318 15:52:26.324681 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1d2de8_ac87_4f0e_97c5_3bbb88279055.slice/crio-3fc909c79bd51fc40c4270a447ff2b653a17ac38db5ae7f7cc06694f8faa74a1 WatchSource:0}: Error finding container 3fc909c79bd51fc40c4270a447ff2b653a17ac38db5ae7f7cc06694f8faa74a1: Status 404 returned error can't find the container with id 3fc909c79bd51fc40c4270a447ff2b653a17ac38db5ae7f7cc06694f8faa74a1 Mar 18 15:52:26 crc 
kubenswrapper[4792]: I0318 15:52:26.501079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" event={"ID":"ae1d2de8-ac87-4f0e-97c5-3bbb88279055","Type":"ContainerStarted","Data":"3fc909c79bd51fc40c4270a447ff2b653a17ac38db5ae7f7cc06694f8faa74a1"}
Mar 18 15:52:26 crc kubenswrapper[4792]: I0318 15:52:26.589344 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85"]
Mar 18 15:52:26 crc kubenswrapper[4792]: W0318 15:52:26.602031 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bdbd945_a92a_471b_8a37_c999fe503caa.slice/crio-abebae8f5dd368a62e569d160f71ead4106e0d55750baa43fc15b79c214e2c5b WatchSource:0}: Error finding container abebae8f5dd368a62e569d160f71ead4106e0d55750baa43fc15b79c214e2c5b: Status 404 returned error can't find the container with id abebae8f5dd368a62e569d160f71ead4106e0d55750baa43fc15b79c214e2c5b
Mar 18 15:52:27 crc kubenswrapper[4792]: I0318 15:52:27.520912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" event={"ID":"8bdbd945-a92a-471b-8a37-c999fe503caa","Type":"ContainerStarted","Data":"abebae8f5dd368a62e569d160f71ead4106e0d55750baa43fc15b79c214e2c5b"}
Mar 18 15:52:30 crc kubenswrapper[4792]: I0318 15:52:30.322237 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 15:52:30 crc kubenswrapper[4792]: I0318 15:52:30.326025 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.592031 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" event={"ID":"ae1d2de8-ac87-4f0e-97c5-3bbb88279055","Type":"ContainerStarted","Data":"4b346a0aee5fa66b859a2a5f724e282e700e0798f7a48b37c59703fc45893388"}
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.592654 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z"
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.593784 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" event={"ID":"8bdbd945-a92a-471b-8a37-c999fe503caa","Type":"ContainerStarted","Data":"0ef9a6c02cb42a9cb1cbd9787645e01c96b47b8327fc0f29833cccc25ec8fa05"}
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.593886 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85"
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.618278 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" podStartSLOduration=2.047779479 podStartE2EDuration="9.618256074s" podCreationTimestamp="2026-03-18 15:52:25 +0000 UTC" firstStartedPulling="2026-03-18 15:52:26.326440443 +0000 UTC m=+1095.195769380" lastFinishedPulling="2026-03-18 15:52:33.896917028 +0000 UTC m=+1102.766245975" observedRunningTime="2026-03-18 15:52:34.611770096 +0000 UTC m=+1103.481099033" watchObservedRunningTime="2026-03-18 15:52:34.618256074 +0000 UTC m=+1103.487585011"
Mar 18 15:52:34 crc kubenswrapper[4792]: I0318 15:52:34.632604 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" podStartSLOduration=2.342441483 podStartE2EDuration="9.632584173s" podCreationTimestamp="2026-03-18 15:52:25 +0000 UTC" firstStartedPulling="2026-03-18 15:52:26.605492987 +0000 UTC m=+1095.474821924" lastFinishedPulling="2026-03-18 15:52:33.895635667 +0000 UTC m=+1102.764964614" observedRunningTime="2026-03-18 15:52:34.62844998 +0000 UTC m=+1103.497778917" watchObservedRunningTime="2026-03-18 15:52:34.632584173 +0000 UTC m=+1103.501913110"
Mar 18 15:52:46 crc kubenswrapper[4792]: I0318 15:52:46.138182 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85"
Mar 18 15:53:00 crc kubenswrapper[4792]: I0318 15:53:00.322181 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 15:53:00 crc kubenswrapper[4792]: I0318 15:53:00.322723 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 15:53:05 crc kubenswrapper[4792]: I0318 15:53:05.750441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.452040 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.453452 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.456153 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-r2x8f"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.457111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.463129 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kvj8m"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.467525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.472399 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.472709 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.474313 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.555832 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kgc9b"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.557526 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.559616 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9v95t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.564734 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.564993 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.571581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.574571 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-9k68t"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.576412 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.580612 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.594435 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-9k68t"]
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598171 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-reloader\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-startup\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics-certs\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598254 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-conf\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhx57\" (UniqueName: \"kubernetes.io/projected/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-kube-api-access-hhx57\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8mz\" (UniqueName: \"kubernetes.io/projected/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-kube-api-access-bx8mz\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-sockets\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.598365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700194 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-conf\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4krk\" (UniqueName: \"kubernetes.io/projected/260602fc-bedf-40ec-92e7-a96e3ee009f0-kube-api-access-r4krk\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700290 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ac5ba665-4ead-4469-9d1c-c777bf26d579-metallb-excludel2\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhx57\" (UniqueName: \"kubernetes.io/projected/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-kube-api-access-hhx57\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8mz\" (UniqueName: \"kubernetes.io/projected/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-kube-api-access-bx8mz\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700417 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvln4\" (UniqueName: \"kubernetes.io/projected/ac5ba665-4ead-4469-9d1c-c777bf26d579-kube-api-access-gvln4\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700442 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-sockets\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700561 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-metrics-certs\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700632 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-reloader\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-startup\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700670 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-cert\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics-certs\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.700729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-metrics-certs\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.701338 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-conf\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: E0318 15:53:06.702455 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 18 15:53:06 crc kubenswrapper[4792]: E0318 15:53:06.702550 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert podName:423d82c6-fd0b-4cb5-8ff2-501f479a9a73 nodeName:}" failed. No retries permitted until 2026-03-18 15:53:07.2025291 +0000 UTC m=+1136.071858037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert") pod "frr-k8s-webhook-server-bcc4b6f68-86pbc" (UID: "423d82c6-fd0b-4cb5-8ff2-501f479a9a73") : secret "frr-k8s-webhook-server-cert" not found
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.702671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.703174 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-startup\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.703233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-frr-sockets\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.703410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-reloader\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.709368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-metrics-certs\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.729506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8mz\" (UniqueName: \"kubernetes.io/projected/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-kube-api-access-bx8mz\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.734241 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhx57\" (UniqueName: \"kubernetes.io/projected/fb6ddafa-95ff-43b2-be7b-352a7fab9d05-kube-api-access-hhx57\") pod \"frr-k8s-kvj8m\" (UID: \"fb6ddafa-95ff-43b2-be7b-352a7fab9d05\") " pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.793478 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kvj8m"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4krk\" (UniqueName: \"kubernetes.io/projected/260602fc-bedf-40ec-92e7-a96e3ee009f0-kube-api-access-r4krk\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802083 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ac5ba665-4ead-4469-9d1c-c777bf26d579-metallb-excludel2\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802128 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvln4\" (UniqueName: \"kubernetes.io/projected/ac5ba665-4ead-4469-9d1c-c777bf26d579-kube-api-access-gvln4\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-metrics-certs\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-cert\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.802295 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-metrics-certs\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: E0318 15:53:06.804154 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 18 15:53:06 crc kubenswrapper[4792]: E0318 15:53:06.804226 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist podName:ac5ba665-4ead-4469-9d1c-c777bf26d579 nodeName:}" failed. No retries permitted until 2026-03-18 15:53:07.304205686 +0000 UTC m=+1136.173534693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist") pod "speaker-kgc9b" (UID: "ac5ba665-4ead-4469-9d1c-c777bf26d579") : secret "metallb-memberlist" not found
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.804660 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ac5ba665-4ead-4469-9d1c-c777bf26d579-metallb-excludel2\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.806423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-metrics-certs\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.807074 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-cert\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.807321 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260602fc-bedf-40ec-92e7-a96e3ee009f0-metrics-certs\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.821684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvln4\" (UniqueName: \"kubernetes.io/projected/ac5ba665-4ead-4469-9d1c-c777bf26d579-kube-api-access-gvln4\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.824443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4krk\" (UniqueName: \"kubernetes.io/projected/260602fc-bedf-40ec-92e7-a96e3ee009f0-kube-api-access-r4krk\") pod \"controller-7bb4cc7c98-9k68t\" (UID: \"260602fc-bedf-40ec-92e7-a96e3ee009f0\") " pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:06 crc kubenswrapper[4792]: I0318 15:53:06.896840 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.212633 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.217555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/423d82c6-fd0b-4cb5-8ff2-501f479a9a73-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-86pbc\" (UID: \"423d82c6-fd0b-4cb5-8ff2-501f479a9a73\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.314316 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:07 crc kubenswrapper[4792]: E0318 15:53:07.314462 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 18 15:53:07 crc kubenswrapper[4792]: E0318 15:53:07.314624 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist podName:ac5ba665-4ead-4469-9d1c-c777bf26d579 nodeName:}" failed. No retries permitted until 2026-03-18 15:53:08.314605178 +0000 UTC m=+1137.183934115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist") pod "speaker-kgc9b" (UID: "ac5ba665-4ead-4469-9d1c-c777bf26d579") : secret "metallb-memberlist" not found
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.330217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-9k68t"]
Mar 18 15:53:07 crc kubenswrapper[4792]: W0318 15:53:07.335245 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260602fc_bedf_40ec_92e7_a96e3ee009f0.slice/crio-e4011c79670629671db14d08ef29671a026edb80bbdfefc3cd140d46a990977d WatchSource:0}: Error finding container e4011c79670629671db14d08ef29671a026edb80bbdfefc3cd140d46a990977d: Status 404 returned error can't find the container with id e4011c79670629671db14d08ef29671a026edb80bbdfefc3cd140d46a990977d
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.373397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.790427 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"]
Mar 18 15:53:07 crc kubenswrapper[4792]: W0318 15:53:07.792995 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423d82c6_fd0b_4cb5_8ff2_501f479a9a73.slice/crio-3e0345cb1fb14937110e5376401037f2fa44b5fe3237178b48254e6e15d06525 WatchSource:0}: Error finding container 3e0345cb1fb14937110e5376401037f2fa44b5fe3237178b48254e6e15d06525: Status 404 returned error can't find the container with id 3e0345cb1fb14937110e5376401037f2fa44b5fe3237178b48254e6e15d06525
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.830032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9k68t" event={"ID":"260602fc-bedf-40ec-92e7-a96e3ee009f0","Type":"ContainerStarted","Data":"9ea2dd060d2883217c2c5d0021a792f8a6c42184161b3013750fa6555bccf20a"}
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.830076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9k68t" event={"ID":"260602fc-bedf-40ec-92e7-a96e3ee009f0","Type":"ContainerStarted","Data":"20fe81435ff85cf55057098572ca1e83a7f6a261773daacd581a3ce69da7f0db"}
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.830087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9k68t" event={"ID":"260602fc-bedf-40ec-92e7-a96e3ee009f0","Type":"ContainerStarted","Data":"e4011c79670629671db14d08ef29671a026edb80bbdfefc3cd140d46a990977d"}
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.831212 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-9k68t"
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.832495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"2d7af252aa28015a3d07beee71f02ffa7d0579193d6d4d47b457d957f1cef987"}
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.833379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" event={"ID":"423d82c6-fd0b-4cb5-8ff2-501f479a9a73","Type":"ContainerStarted","Data":"3e0345cb1fb14937110e5376401037f2fa44b5fe3237178b48254e6e15d06525"}
Mar 18 15:53:07 crc kubenswrapper[4792]: I0318 15:53:07.851772 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-9k68t" podStartSLOduration=1.8517515960000002 podStartE2EDuration="1.851751596s" podCreationTimestamp="2026-03-18 15:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:53:07.845909439 +0000 UTC m=+1136.715238386" watchObservedRunningTime="2026-03-18 15:53:07.851751596 +0000 UTC m=+1136.721080533"
Mar 18 15:53:08 crc kubenswrapper[4792]: I0318 15:53:08.332899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:08 crc kubenswrapper[4792]: I0318 15:53:08.352784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ac5ba665-4ead-4469-9d1c-c777bf26d579-memberlist\") pod \"speaker-kgc9b\" (UID: \"ac5ba665-4ead-4469-9d1c-c777bf26d579\") " pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:08 crc kubenswrapper[4792]: I0318 15:53:08.373261 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:08 crc kubenswrapper[4792]: W0318 15:53:08.412255 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5ba665_4ead_4469_9d1c_c777bf26d579.slice/crio-c6b4a666935c9e7d0b930be9c3daa263978de07056b1aa72c810570d2e2ff751 WatchSource:0}: Error finding container c6b4a666935c9e7d0b930be9c3daa263978de07056b1aa72c810570d2e2ff751: Status 404 returned error can't find the container with id c6b4a666935c9e7d0b930be9c3daa263978de07056b1aa72c810570d2e2ff751
Mar 18 15:53:08 crc kubenswrapper[4792]: I0318 15:53:08.845218 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kgc9b" event={"ID":"ac5ba665-4ead-4469-9d1c-c777bf26d579","Type":"ContainerStarted","Data":"1ad8a712729e5bb7e27eb7ded07bdd83cf86a3b51c73f1dcbf4a7a05e8135b0f"}
Mar 18 15:53:08 crc kubenswrapper[4792]: I0318 15:53:08.845269 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kgc9b" event={"ID":"ac5ba665-4ead-4469-9d1c-c777bf26d579","Type":"ContainerStarted","Data":"c6b4a666935c9e7d0b930be9c3daa263978de07056b1aa72c810570d2e2ff751"}
Mar 18 15:53:09 crc kubenswrapper[4792]: I0318 15:53:09.865863 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kgc9b" event={"ID":"ac5ba665-4ead-4469-9d1c-c777bf26d579","Type":"ContainerStarted","Data":"c7d3b7dc5264fe58a68ce5a5f3a60642e84105e2bb4a53c1d19c057a3c2091d1"}
Mar 18 15:53:09 crc kubenswrapper[4792]: I0318 15:53:09.888616 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kgc9b" podStartSLOduration=3.888597309 podStartE2EDuration="3.888597309s" podCreationTimestamp="2026-03-18 15:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:53:09.886633627 +0000 UTC m=+1138.755962584" watchObservedRunningTime="2026-03-18 15:53:09.888597309 +0000 UTC m=+1138.757926246"
Mar 18 15:53:10 crc kubenswrapper[4792]: I0318 15:53:10.873063 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:15 crc kubenswrapper[4792]: I0318 15:53:15.912255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"9ccd43a33663dddac2f84f10e3396d126fbfb99f07d28c5be9ab0ecdd7dedf1f"}
Mar 18 15:53:16 crc kubenswrapper[4792]: I0318 15:53:16.919563 4792 generic.go:334] "Generic (PLEG): container finished" podID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerID="9ccd43a33663dddac2f84f10e3396d126fbfb99f07d28c5be9ab0ecdd7dedf1f" exitCode=0
Mar 18 15:53:16 crc kubenswrapper[4792]: I0318 15:53:16.919628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerDied","Data":"9ccd43a33663dddac2f84f10e3396d126fbfb99f07d28c5be9ab0ecdd7dedf1f"}
Mar 18 15:53:16 crc kubenswrapper[4792]: I0318 15:53:16.921233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" event={"ID":"423d82c6-fd0b-4cb5-8ff2-501f479a9a73","Type":"ContainerStarted","Data":"c5f68b83aa52d3dee8be75c0eecb0c3dce1400076d7ec501a6ae92e674089b1b"}
Mar 18 15:53:16 crc kubenswrapper[4792]: I0318 15:53:16.921375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc"
Mar 18 15:53:16 crc kubenswrapper[4792]: I0318 15:53:16.960546 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podStartSLOduration=3.047085645 podStartE2EDuration="10.960527703s" podCreationTimestamp="2026-03-18 15:53:06 +0000 UTC" firstStartedPulling="2026-03-18 15:53:07.794898585 +0000 UTC m=+1136.664227522" lastFinishedPulling="2026-03-18 15:53:15.708340643 +0000 UTC m=+1144.577669580" observedRunningTime="2026-03-18 15:53:16.960059248 +0000 UTC m=+1145.829388195" watchObservedRunningTime="2026-03-18 15:53:16.960527703 +0000 UTC m=+1145.829856640"
Mar 18 15:53:17 crc kubenswrapper[4792]: I0318 15:53:17.929905 4792 generic.go:334] "Generic (PLEG): container finished" podID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerID="cd5b474aeef77b948cc6400eb0590484b1b92a7cb67024f40676f5ade2540c77" exitCode=0
Mar 18 15:53:17 crc kubenswrapper[4792]: I0318 15:53:17.929995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerDied","Data":"cd5b474aeef77b948cc6400eb0590484b1b92a7cb67024f40676f5ade2540c77"}
Mar 18 15:53:18 crc kubenswrapper[4792]: I0318 15:53:18.377658 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kgc9b"
Mar 18 15:53:18 crc kubenswrapper[4792]: I0318 15:53:18.939886 4792 generic.go:334] "Generic (PLEG): container finished" podID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerID="a67f52ea33dd24848ccf5737a248959c93f849d5ac262060146c4bb7aa8c2ce9" exitCode=0
Mar 18 15:53:18 crc kubenswrapper[4792]: I0318 15:53:18.939932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerDied","Data":"a67f52ea33dd24848ccf5737a248959c93f849d5ac262060146c4bb7aa8c2ce9"}
Mar 18 15:53:19 crc kubenswrapper[4792]: I0318 15:53:19.954079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"900bf463ee0fcaa0359a2f8c172d13c003803b065a7e7f1289f8d9059ce3244d"}
Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.964820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"f21ff65400c0d5dd3d0f9b2463d3f5ef9f8ac7c06c275c168d31f5efe546ec2c"} Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.965347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"85ffdf128e68acc5dd67ae80cfd444920ddeac8f419f1c0de4bdd4bf6daf98ec"} Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.965622 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kvj8m" Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.965638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"c71519219e941b5ab37e80760227ff9ed5a2fd7d644681f43d2d60cedc80e793"} Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.965650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"ab1681b2f4f7a95e056b48e6e5161e0f5ab38dc838e66168b76fb39aa249d082"} Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.965660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"c838efb357e9b230f9969499ec5989440f216e571e2fba7ccf75350f1f500ecb"} Mar 18 15:53:20 crc kubenswrapper[4792]: I0318 15:53:20.991354 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kvj8m" podStartSLOduration=6.294666025 podStartE2EDuration="14.991337209s" podCreationTimestamp="2026-03-18 15:53:06 +0000 UTC" firstStartedPulling="2026-03-18 15:53:06.969635383 +0000 UTC m=+1135.838964330" lastFinishedPulling="2026-03-18 
15:53:15.666306577 +0000 UTC m=+1144.535635514" observedRunningTime="2026-03-18 15:53:20.984105388 +0000 UTC m=+1149.853434335" watchObservedRunningTime="2026-03-18 15:53:20.991337209 +0000 UTC m=+1149.860666146" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.045173 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.046388 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.052531 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hhc46" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.052782 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.053058 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.067553 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.168446 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4kq\" (UniqueName: \"kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq\") pod \"openstack-operator-index-9ksw4\" (UID: \"2e631a08-2290-4b09-bb7c-e45d16c4ec21\") " pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.269935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4kq\" (UniqueName: 
\"kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq\") pod \"openstack-operator-index-9ksw4\" (UID: \"2e631a08-2290-4b09-bb7c-e45d16c4ec21\") " pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.290675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4kq\" (UniqueName: \"kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq\") pod \"openstack-operator-index-9ksw4\" (UID: \"2e631a08-2290-4b09-bb7c-e45d16c4ec21\") " pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.369288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.793704 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kvj8m" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.802434 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.898587 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kvj8m" Mar 18 15:53:21 crc kubenswrapper[4792]: I0318 15:53:21.972399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9ksw4" event={"ID":"2e631a08-2290-4b09-bb7c-e45d16c4ec21","Type":"ContainerStarted","Data":"42a5a76a412b12aa9e750055088041dd5884a247bca85476ea5fd215f1f877da"} Mar 18 15:53:23 crc kubenswrapper[4792]: I0318 15:53:23.827509 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.436925 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-gkmkk"] Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.439230 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.443511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gkmkk"] Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.538608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69mf\" (UniqueName: \"kubernetes.io/projected/cc92711e-be7a-4025-9077-cac9e5bc7df8-kube-api-access-w69mf\") pod \"openstack-operator-index-gkmkk\" (UID: \"cc92711e-be7a-4025-9077-cac9e5bc7df8\") " pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.641819 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w69mf\" (UniqueName: \"kubernetes.io/projected/cc92711e-be7a-4025-9077-cac9e5bc7df8-kube-api-access-w69mf\") pod \"openstack-operator-index-gkmkk\" (UID: \"cc92711e-be7a-4025-9077-cac9e5bc7df8\") " pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.668454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69mf\" (UniqueName: \"kubernetes.io/projected/cc92711e-be7a-4025-9077-cac9e5bc7df8-kube-api-access-w69mf\") pod \"openstack-operator-index-gkmkk\" (UID: \"cc92711e-be7a-4025-9077-cac9e5bc7df8\") " pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:24 crc kubenswrapper[4792]: I0318 15:53:24.761207 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:25 crc kubenswrapper[4792]: I0318 15:53:25.377264 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gkmkk"] Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.022563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gkmkk" event={"ID":"cc92711e-be7a-4025-9077-cac9e5bc7df8","Type":"ContainerStarted","Data":"352c9a6d9c3552b2bd4a638d84ccd64da9f8b9d243f6179dfcab7e0f1681b407"} Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.022923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gkmkk" event={"ID":"cc92711e-be7a-4025-9077-cac9e5bc7df8","Type":"ContainerStarted","Data":"28809813dcf7cad3795face0add538a7e02dcd0edccce6c05fe870436695bd07"} Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.026547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9ksw4" event={"ID":"2e631a08-2290-4b09-bb7c-e45d16c4ec21","Type":"ContainerStarted","Data":"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0"} Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.026713 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9ksw4" podUID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" containerName="registry-server" containerID="cri-o://1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0" gracePeriod=2 Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.040319 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gkmkk" podStartSLOduration=1.988794865 podStartE2EDuration="2.040279673s" podCreationTimestamp="2026-03-18 15:53:24 +0000 UTC" firstStartedPulling="2026-03-18 15:53:25.382929976 +0000 UTC 
m=+1154.252258913" lastFinishedPulling="2026-03-18 15:53:25.434414784 +0000 UTC m=+1154.303743721" observedRunningTime="2026-03-18 15:53:26.040248802 +0000 UTC m=+1154.909577749" watchObservedRunningTime="2026-03-18 15:53:26.040279673 +0000 UTC m=+1154.909608610" Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.065612 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9ksw4" podStartSLOduration=1.898487982 podStartE2EDuration="5.065587193s" podCreationTimestamp="2026-03-18 15:53:21 +0000 UTC" firstStartedPulling="2026-03-18 15:53:21.800208908 +0000 UTC m=+1150.669537845" lastFinishedPulling="2026-03-18 15:53:24.967308119 +0000 UTC m=+1153.836637056" observedRunningTime="2026-03-18 15:53:26.059153417 +0000 UTC m=+1154.928482354" watchObservedRunningTime="2026-03-18 15:53:26.065587193 +0000 UTC m=+1154.934916150" Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.513680 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.577329 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr4kq\" (UniqueName: \"kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq\") pod \"2e631a08-2290-4b09-bb7c-e45d16c4ec21\" (UID: \"2e631a08-2290-4b09-bb7c-e45d16c4ec21\") " Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.582610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq" (OuterVolumeSpecName: "kube-api-access-sr4kq") pod "2e631a08-2290-4b09-bb7c-e45d16c4ec21" (UID: "2e631a08-2290-4b09-bb7c-e45d16c4ec21"). InnerVolumeSpecName "kube-api-access-sr4kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.679621 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr4kq\" (UniqueName: \"kubernetes.io/projected/2e631a08-2290-4b09-bb7c-e45d16c4ec21-kube-api-access-sr4kq\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:26 crc kubenswrapper[4792]: I0318 15:53:26.902935 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-9k68t" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.035856 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" containerID="1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0" exitCode=0 Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.035897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9ksw4" event={"ID":"2e631a08-2290-4b09-bb7c-e45d16c4ec21","Type":"ContainerDied","Data":"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0"} Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.035934 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9ksw4" event={"ID":"2e631a08-2290-4b09-bb7c-e45d16c4ec21","Type":"ContainerDied","Data":"42a5a76a412b12aa9e750055088041dd5884a247bca85476ea5fd215f1f877da"} Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.035878 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9ksw4" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.035955 4792 scope.go:117] "RemoveContainer" containerID="1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.058891 4792 scope.go:117] "RemoveContainer" containerID="1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0" Mar 18 15:53:27 crc kubenswrapper[4792]: E0318 15:53:27.060503 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0\": container with ID starting with 1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0 not found: ID does not exist" containerID="1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.060583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0"} err="failed to get container status \"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0\": rpc error: code = NotFound desc = could not find container \"1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0\": container with ID starting with 1cff87f808f8803ff5a368fc8a61cc01cabb09dee1fe3f71c950040446ca63e0 not found: ID does not exist" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.072632 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.079629 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9ksw4"] Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.377476 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" Mar 18 15:53:27 crc kubenswrapper[4792]: I0318 15:53:27.864674 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" path="/var/lib/kubelet/pods/2e631a08-2290-4b09-bb7c-e45d16c4ec21/volumes" Mar 18 15:53:30 crc kubenswrapper[4792]: I0318 15:53:30.322562 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:53:30 crc kubenswrapper[4792]: I0318 15:53:30.323891 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:53:30 crc kubenswrapper[4792]: I0318 15:53:30.324004 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:53:30 crc kubenswrapper[4792]: I0318 15:53:30.325105 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:53:30 crc kubenswrapper[4792]: I0318 15:53:30.325192 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" 
containerID="cri-o://f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2" gracePeriod=600 Mar 18 15:53:31 crc kubenswrapper[4792]: I0318 15:53:31.072374 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2" exitCode=0 Mar 18 15:53:31 crc kubenswrapper[4792]: I0318 15:53:31.072446 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2"} Mar 18 15:53:31 crc kubenswrapper[4792]: I0318 15:53:31.072647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f"} Mar 18 15:53:31 crc kubenswrapper[4792]: I0318 15:53:31.072676 4792 scope.go:117] "RemoveContainer" containerID="f33d47ada8d06b2ac36e49a83c544decfab94cfff67a570889a75d2335bcd957" Mar 18 15:53:34 crc kubenswrapper[4792]: I0318 15:53:34.761611 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:34 crc kubenswrapper[4792]: I0318 15:53:34.763082 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:34 crc kubenswrapper[4792]: I0318 15:53:34.806809 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:35 crc kubenswrapper[4792]: I0318 15:53:35.135130 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gkmkk" Mar 18 15:53:36 
crc kubenswrapper[4792]: I0318 15:53:36.797066 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kvj8m" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.476763 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj"] Mar 18 15:53:48 crc kubenswrapper[4792]: E0318 15:53:48.478470 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" containerName="registry-server" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.478571 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" containerName="registry-server" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.478866 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e631a08-2290-4b09-bb7c-e45d16c4ec21" containerName="registry-server" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.480335 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.482943 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-p9lwl" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.486664 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj"] Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.563720 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.563769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86r8x\" (UniqueName: \"kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.563831 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 
15:53:48.665638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.665842 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.665897 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86r8x\" (UniqueName: \"kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.667033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.667046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.699827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86r8x\" (UniqueName: \"kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x\") pod \"9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:48 crc kubenswrapper[4792]: I0318 15:53:48.799310 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:49 crc kubenswrapper[4792]: I0318 15:53:49.276995 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj"] Mar 18 15:53:49 crc kubenswrapper[4792]: W0318 15:53:49.285172 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf550007c_836c_4db3_b4cc_4ad8ffca5264.slice/crio-0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a WatchSource:0}: Error finding container 0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a: Status 404 returned error can't find the container with id 0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a Mar 18 15:53:50 crc kubenswrapper[4792]: I0318 15:53:50.226320 4792 generic.go:334] "Generic (PLEG): container finished" podID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerID="5857574fa7e0bfbc84a1dd61fd73913b1c19d92dfd2caa983058a5b1c78390c4" exitCode=0 Mar 18 
15:53:50 crc kubenswrapper[4792]: I0318 15:53:50.227498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" event={"ID":"f550007c-836c-4db3-b4cc-4ad8ffca5264","Type":"ContainerDied","Data":"5857574fa7e0bfbc84a1dd61fd73913b1c19d92dfd2caa983058a5b1c78390c4"} Mar 18 15:53:50 crc kubenswrapper[4792]: I0318 15:53:50.227636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" event={"ID":"f550007c-836c-4db3-b4cc-4ad8ffca5264","Type":"ContainerStarted","Data":"0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a"} Mar 18 15:53:51 crc kubenswrapper[4792]: I0318 15:53:51.240718 4792 generic.go:334] "Generic (PLEG): container finished" podID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerID="f61fa71d02c1ee3050ad8ef5208b34edffe473538efc2f0f7c8d252b4fe2650f" exitCode=0 Mar 18 15:53:51 crc kubenswrapper[4792]: I0318 15:53:51.240778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" event={"ID":"f550007c-836c-4db3-b4cc-4ad8ffca5264","Type":"ContainerDied","Data":"f61fa71d02c1ee3050ad8ef5208b34edffe473538efc2f0f7c8d252b4fe2650f"} Mar 18 15:53:52 crc kubenswrapper[4792]: I0318 15:53:52.253233 4792 generic.go:334] "Generic (PLEG): container finished" podID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerID="fbe88da3faa2a4cd3f349d180820b241b6c453b1367c4e752b76be4c56cb115b" exitCode=0 Mar 18 15:53:52 crc kubenswrapper[4792]: I0318 15:53:52.253282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" event={"ID":"f550007c-836c-4db3-b4cc-4ad8ffca5264","Type":"ContainerDied","Data":"fbe88da3faa2a4cd3f349d180820b241b6c453b1367c4e752b76be4c56cb115b"} Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.529777 
4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.647710 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle\") pod \"f550007c-836c-4db3-b4cc-4ad8ffca5264\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.647788 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86r8x\" (UniqueName: \"kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x\") pod \"f550007c-836c-4db3-b4cc-4ad8ffca5264\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.647882 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util\") pod \"f550007c-836c-4db3-b4cc-4ad8ffca5264\" (UID: \"f550007c-836c-4db3-b4cc-4ad8ffca5264\") " Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.648523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle" (OuterVolumeSpecName: "bundle") pod "f550007c-836c-4db3-b4cc-4ad8ffca5264" (UID: "f550007c-836c-4db3-b4cc-4ad8ffca5264"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.653808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x" (OuterVolumeSpecName: "kube-api-access-86r8x") pod "f550007c-836c-4db3-b4cc-4ad8ffca5264" (UID: "f550007c-836c-4db3-b4cc-4ad8ffca5264"). 
InnerVolumeSpecName "kube-api-access-86r8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.661263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util" (OuterVolumeSpecName: "util") pod "f550007c-836c-4db3-b4cc-4ad8ffca5264" (UID: "f550007c-836c-4db3-b4cc-4ad8ffca5264"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.750424 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.750467 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86r8x\" (UniqueName: \"kubernetes.io/projected/f550007c-836c-4db3-b4cc-4ad8ffca5264-kube-api-access-86r8x\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:53 crc kubenswrapper[4792]: I0318 15:53:53.750486 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f550007c-836c-4db3-b4cc-4ad8ffca5264-util\") on node \"crc\" DevicePath \"\"" Mar 18 15:53:54 crc kubenswrapper[4792]: I0318 15:53:54.270472 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" event={"ID":"f550007c-836c-4db3-b4cc-4ad8ffca5264","Type":"ContainerDied","Data":"0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a"} Mar 18 15:53:54 crc kubenswrapper[4792]: I0318 15:53:54.270548 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj" Mar 18 15:53:54 crc kubenswrapper[4792]: I0318 15:53:54.270568 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b7e0d844d47d16f2dd89f20a957a4d25ef4630c91f79a8762558499b271649a" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.834920 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm"] Mar 18 15:53:59 crc kubenswrapper[4792]: E0318 15:53:59.835821 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="extract" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.835840 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="extract" Mar 18 15:53:59 crc kubenswrapper[4792]: E0318 15:53:59.835861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="pull" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.835867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="pull" Mar 18 15:53:59 crc kubenswrapper[4792]: E0318 15:53:59.835884 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="util" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.835890 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="util" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.836054 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f550007c-836c-4db3-b4cc-4ad8ffca5264" containerName="extract" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.836732 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.839149 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2m4dt" Mar 18 15:53:59 crc kubenswrapper[4792]: I0318 15:53:59.960341 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzhxm\" (UniqueName: \"kubernetes.io/projected/407238a6-a2e5-420c-801b-8a4329eebadd-kube-api-access-dzhxm\") pod \"openstack-operator-controller-init-7f795bfd45-wf9cm\" (UID: \"407238a6-a2e5-420c-801b-8a4329eebadd\") " pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.000337 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm"] Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.062788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzhxm\" (UniqueName: \"kubernetes.io/projected/407238a6-a2e5-420c-801b-8a4329eebadd-kube-api-access-dzhxm\") pod \"openstack-operator-controller-init-7f795bfd45-wf9cm\" (UID: \"407238a6-a2e5-420c-801b-8a4329eebadd\") " pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.089447 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzhxm\" (UniqueName: \"kubernetes.io/projected/407238a6-a2e5-420c-801b-8a4329eebadd-kube-api-access-dzhxm\") pod \"openstack-operator-controller-init-7f795bfd45-wf9cm\" (UID: \"407238a6-a2e5-420c-801b-8a4329eebadd\") " pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.155383 4792 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.195859 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564154-zvm46"] Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.199003 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.202404 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.202913 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.203138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.207130 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-zvm46"] Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.268593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdv5\" (UniqueName: \"kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5\") pod \"auto-csr-approver-29564154-zvm46\" (UID: \"2520256f-dcf3-41f8-9dcf-73e8963a5ae0\") " pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.370342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdv5\" (UniqueName: \"kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5\") pod \"auto-csr-approver-29564154-zvm46\" (UID: \"2520256f-dcf3-41f8-9dcf-73e8963a5ae0\") " 
pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.401258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdv5\" (UniqueName: \"kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5\") pod \"auto-csr-approver-29564154-zvm46\" (UID: \"2520256f-dcf3-41f8-9dcf-73e8963a5ae0\") " pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.571249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:00 crc kubenswrapper[4792]: I0318 15:54:00.641723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm"] Mar 18 15:54:01 crc kubenswrapper[4792]: I0318 15:54:01.032071 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-zvm46"] Mar 18 15:54:01 crc kubenswrapper[4792]: W0318 15:54:01.044252 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2520256f_dcf3_41f8_9dcf_73e8963a5ae0.slice/crio-a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437 WatchSource:0}: Error finding container a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437: Status 404 returned error can't find the container with id a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437 Mar 18 15:54:01 crc kubenswrapper[4792]: I0318 15:54:01.326298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" event={"ID":"407238a6-a2e5-420c-801b-8a4329eebadd","Type":"ContainerStarted","Data":"7654d7c41c7a1a565dee3e60290d9ceabf58eb630c56a95f6becbd79076f25cb"} Mar 18 15:54:01 crc kubenswrapper[4792]: I0318 15:54:01.329120 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-zvm46" event={"ID":"2520256f-dcf3-41f8-9dcf-73e8963a5ae0","Type":"ContainerStarted","Data":"a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437"} Mar 18 15:54:05 crc kubenswrapper[4792]: I0318 15:54:05.361126 4792 generic.go:334] "Generic (PLEG): container finished" podID="2520256f-dcf3-41f8-9dcf-73e8963a5ae0" containerID="6dffd13a15c195a01c61ba4fddecd3d3d2c2f87ae6f586556c6a60b307ffc030" exitCode=0 Mar 18 15:54:05 crc kubenswrapper[4792]: I0318 15:54:05.361183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-zvm46" event={"ID":"2520256f-dcf3-41f8-9dcf-73e8963a5ae0","Type":"ContainerDied","Data":"6dffd13a15c195a01c61ba4fddecd3d3d2c2f87ae6f586556c6a60b307ffc030"} Mar 18 15:54:05 crc kubenswrapper[4792]: I0318 15:54:05.363739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" event={"ID":"407238a6-a2e5-420c-801b-8a4329eebadd","Type":"ContainerStarted","Data":"3f3f613e3f0194ee58a35a3ad2dc27d0c4201f624e78dfb51fa750c0acd7c1ac"} Mar 18 15:54:05 crc kubenswrapper[4792]: I0318 15:54:05.363901 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:05 crc kubenswrapper[4792]: I0318 15:54:05.399790 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" podStartSLOduration=2.6450094589999997 podStartE2EDuration="6.399767313s" podCreationTimestamp="2026-03-18 15:53:59 +0000 UTC" firstStartedPulling="2026-03-18 15:54:00.65635373 +0000 UTC m=+1189.525682667" lastFinishedPulling="2026-03-18 15:54:04.411111584 +0000 UTC m=+1193.280440521" observedRunningTime="2026-03-18 15:54:05.396819556 +0000 UTC m=+1194.266148493" 
watchObservedRunningTime="2026-03-18 15:54:05.399767313 +0000 UTC m=+1194.269096250" Mar 18 15:54:06 crc kubenswrapper[4792]: I0318 15:54:06.705951 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:06 crc kubenswrapper[4792]: I0318 15:54:06.792694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjdv5\" (UniqueName: \"kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5\") pod \"2520256f-dcf3-41f8-9dcf-73e8963a5ae0\" (UID: \"2520256f-dcf3-41f8-9dcf-73e8963a5ae0\") " Mar 18 15:54:06 crc kubenswrapper[4792]: I0318 15:54:06.797592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5" (OuterVolumeSpecName: "kube-api-access-qjdv5") pod "2520256f-dcf3-41f8-9dcf-73e8963a5ae0" (UID: "2520256f-dcf3-41f8-9dcf-73e8963a5ae0"). InnerVolumeSpecName "kube-api-access-qjdv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:54:06 crc kubenswrapper[4792]: I0318 15:54:06.895759 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjdv5\" (UniqueName: \"kubernetes.io/projected/2520256f-dcf3-41f8-9dcf-73e8963a5ae0-kube-api-access-qjdv5\") on node \"crc\" DevicePath \"\"" Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.381880 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564154-zvm46" event={"ID":"2520256f-dcf3-41f8-9dcf-73e8963a5ae0","Type":"ContainerDied","Data":"a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437"} Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.381932 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7eee13f66090d782c524ea55709cf51973ec4d7cc02ee8291716bfe52ba2437" Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.381933 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564154-zvm46" Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.791658 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-j8krc"] Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.799590 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564148-j8krc"] Mar 18 15:54:07 crc kubenswrapper[4792]: I0318 15:54:07.862839 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d86a35-8172-4d88-bcd3-06612a005ddc" path="/var/lib/kubelet/pods/d4d86a35-8172-4d88-bcd3-06612a005ddc/volumes" Mar 18 15:54:10 crc kubenswrapper[4792]: I0318 15:54:10.158531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" Mar 18 15:54:26 crc kubenswrapper[4792]: I0318 15:54:26.405921 4792 scope.go:117] "RemoveContainer" 
containerID="b02b657f2dc9309556d5da3fb8f3ac436e35756ad8ecdba90343ec70151242c2" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.630157 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q"] Mar 18 15:54:42 crc kubenswrapper[4792]: E0318 15:54:42.631271 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2520256f-dcf3-41f8-9dcf-73e8963a5ae0" containerName="oc" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.631292 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2520256f-dcf3-41f8-9dcf-73e8963a5ae0" containerName="oc" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.631496 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2520256f-dcf3-41f8-9dcf-73e8963a5ae0" containerName="oc" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.632199 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.635326 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xtjlk" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.639398 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.640847 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.642891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-cm5jx" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.647149 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.672086 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.686768 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.688863 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.693253 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dsg87" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.719082 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.748040 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.749117 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.765937 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9489z" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.771088 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.817940 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.819143 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826305 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdr7s\" (UniqueName: \"kubernetes.io/projected/aa2e6c5a-c94a-482a-aceb-156b1cc316d0-kube-api-access-qdr7s\") pod \"glance-operator-controller-manager-7d559dcdbd-jtksk\" (UID: \"aa2e6c5a-c94a-482a-aceb-156b1cc316d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtgd\" (UniqueName: \"kubernetes.io/projected/af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9-kube-api-access-jmtgd\") pod \"heat-operator-controller-manager-66dd9d474d-nvs5w\" (UID: \"af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826419 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwg7\" (UniqueName: \"kubernetes.io/projected/6ccc988b-8909-4e90-b016-c94a1deb2de7-kube-api-access-qqwg7\") pod \"designate-operator-controller-manager-6cc65c69fc-tls5q\" (UID: \"6ccc988b-8909-4e90-b016-c94a1deb2de7\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826457 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqz2v\" (UniqueName: \"kubernetes.io/projected/3692a84a-23dc-4b6c-9c20-d97bd0e285d8-kube-api-access-vqz2v\") pod \"cinder-operator-controller-manager-6d77645966-xzpbw\" (UID: \"3692a84a-23dc-4b6c-9c20-d97bd0e285d8\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5mhq\" (UniqueName: \"kubernetes.io/projected/79896742-17fd-4960-ae5b-af3c83550a4e-kube-api-access-g5mhq\") pod \"barbican-operator-controller-manager-5cfd84c587-rpk5q\" (UID: \"79896742-17fd-4960-ae5b-af3c83550a4e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.826954 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-x7h87" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.831759 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.834718 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.845881 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4ns98" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.866394 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.867706 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.874020 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fd5p2" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.874355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.888368 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.908478 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.909573 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.916668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vhznz" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.927872 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdr7s\" (UniqueName: \"kubernetes.io/projected/aa2e6c5a-c94a-482a-aceb-156b1cc316d0-kube-api-access-qdr7s\") pod \"glance-operator-controller-manager-7d559dcdbd-jtksk\" (UID: \"aa2e6c5a-c94a-482a-aceb-156b1cc316d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.927958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtgd\" (UniqueName: \"kubernetes.io/projected/af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9-kube-api-access-jmtgd\") pod \"heat-operator-controller-manager-66dd9d474d-nvs5w\" (UID: \"af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.928031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqwg7\" (UniqueName: \"kubernetes.io/projected/6ccc988b-8909-4e90-b016-c94a1deb2de7-kube-api-access-qqwg7\") pod \"designate-operator-controller-manager-6cc65c69fc-tls5q\" (UID: \"6ccc988b-8909-4e90-b016-c94a1deb2de7\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.928076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqz2v\" (UniqueName: \"kubernetes.io/projected/3692a84a-23dc-4b6c-9c20-d97bd0e285d8-kube-api-access-vqz2v\") pod 
\"cinder-operator-controller-manager-6d77645966-xzpbw\" (UID: \"3692a84a-23dc-4b6c-9c20-d97bd0e285d8\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.928113 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5mhq\" (UniqueName: \"kubernetes.io/projected/79896742-17fd-4960-ae5b-af3c83550a4e-kube-api-access-g5mhq\") pod \"barbican-operator-controller-manager-5cfd84c587-rpk5q\" (UID: \"79896742-17fd-4960-ae5b-af3c83550a4e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.932024 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.951023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.973426 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn"] Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.973833 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqz2v\" (UniqueName: \"kubernetes.io/projected/3692a84a-23dc-4b6c-9c20-d97bd0e285d8-kube-api-access-vqz2v\") pod \"cinder-operator-controller-manager-6d77645966-xzpbw\" (UID: \"3692a84a-23dc-4b6c-9c20-d97bd0e285d8\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.980740 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5mhq\" (UniqueName: \"kubernetes.io/projected/79896742-17fd-4960-ae5b-af3c83550a4e-kube-api-access-g5mhq\") pod 
\"barbican-operator-controller-manager-5cfd84c587-rpk5q\" (UID: \"79896742-17fd-4960-ae5b-af3c83550a4e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.982888 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtgd\" (UniqueName: \"kubernetes.io/projected/af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9-kube-api-access-jmtgd\") pod \"heat-operator-controller-manager-66dd9d474d-nvs5w\" (UID: \"af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.983178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdr7s\" (UniqueName: \"kubernetes.io/projected/aa2e6c5a-c94a-482a-aceb-156b1cc316d0-kube-api-access-qdr7s\") pod \"glance-operator-controller-manager-7d559dcdbd-jtksk\" (UID: \"aa2e6c5a-c94a-482a-aceb-156b1cc316d0\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.989754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwg7\" (UniqueName: \"kubernetes.io/projected/6ccc988b-8909-4e90-b016-c94a1deb2de7-kube-api-access-qqwg7\") pod \"designate-operator-controller-manager-6cc65c69fc-tls5q\" (UID: \"6ccc988b-8909-4e90-b016-c94a1deb2de7\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:54:42 crc kubenswrapper[4792]: I0318 15:54:42.997197 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.006203 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qwpfx" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.011290 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.021539 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.030598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4r72\" (UniqueName: \"kubernetes.io/projected/55d5f156-656e-4e2f-b368-e841124084d1-kube-api-access-j4r72\") pod \"horizon-operator-controller-manager-64dc66d669-26gf6\" (UID: \"55d5f156-656e-4e2f-b368-e841124084d1\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.030759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.030855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbfdr\" (UniqueName: \"kubernetes.io/projected/d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512-kube-api-access-dbfdr\") pod \"ironic-operator-controller-manager-6b77b7676d-gc45m\" 
(UID: \"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.030893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4frz\" (UniqueName: \"kubernetes.io/projected/155eb4c3-aa63-4ec7-9824-1bef2045a68b-kube-api-access-k4frz\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.044803 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.046125 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.049747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vvtxv" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.072382 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.080990 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.086040 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.087102 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.090464 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7c96t" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.097005 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.134284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4r72\" (UniqueName: \"kubernetes.io/projected/55d5f156-656e-4e2f-b368-e841124084d1-kube-api-access-j4r72\") pod \"horizon-operator-controller-manager-64dc66d669-26gf6\" (UID: \"55d5f156-656e-4e2f-b368-e841124084d1\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.134683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.134792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbfdr\" (UniqueName: \"kubernetes.io/projected/d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512-kube-api-access-dbfdr\") pod \"ironic-operator-controller-manager-6b77b7676d-gc45m\" (UID: \"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.134815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k4frz\" (UniqueName: \"kubernetes.io/projected/155eb4c3-aa63-4ec7-9824-1bef2045a68b-kube-api-access-k4frz\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.134852 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brnzj\" (UniqueName: \"kubernetes.io/projected/dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b-kube-api-access-brnzj\") pod \"keystone-operator-controller-manager-76b87776c9-mf8vn\" (UID: \"dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.136408 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.136478 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert podName:155eb4c3-aa63-4ec7-9824-1bef2045a68b nodeName:}" failed. No retries permitted until 2026-03-18 15:54:43.636444817 +0000 UTC m=+1232.505773754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert") pod "infra-operator-controller-manager-5595c7d6ff-chpgl" (UID: "155eb4c3-aa63-4ec7-9824-1bef2045a68b") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.166638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbfdr\" (UniqueName: \"kubernetes.io/projected/d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512-kube-api-access-dbfdr\") pod \"ironic-operator-controller-manager-6b77b7676d-gc45m\" (UID: \"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.167166 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.167690 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.200195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4r72\" (UniqueName: \"kubernetes.io/projected/55d5f156-656e-4e2f-b368-e841124084d1-kube-api-access-j4r72\") pod \"horizon-operator-controller-manager-64dc66d669-26gf6\" (UID: \"55d5f156-656e-4e2f-b368-e841124084d1\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.204373 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.206792 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4frz\" (UniqueName: \"kubernetes.io/projected/155eb4c3-aa63-4ec7-9824-1bef2045a68b-kube-api-access-k4frz\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.223106 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.224364 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.227326 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tzfmp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.234224 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.235359 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.236664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntkcw\" (UniqueName: \"kubernetes.io/projected/65722e7d-1557-437c-ae5c-383082933c8c-kube-api-access-ntkcw\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-n4j9l\" (UID: \"65722e7d-1557-437c-ae5c-383082933c8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.236768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brnzj\" (UniqueName: \"kubernetes.io/projected/dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b-kube-api-access-brnzj\") pod \"keystone-operator-controller-manager-76b87776c9-mf8vn\" (UID: \"dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.236824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jgm\" (UniqueName: \"kubernetes.io/projected/dd73a890-f234-415f-b99a-685059be7d48-kube-api-access-92jgm\") pod \"manila-operator-controller-manager-fbf7bbb96-kmpvr\" (UID: \"dd73a890-f234-415f-b99a-685059be7d48\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.241627 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.242271 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-l96qf" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.255588 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.267960 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.268371 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.271602 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brnzj\" (UniqueName: \"kubernetes.io/projected/dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b-kube-api-access-brnzj\") pod \"keystone-operator-controller-manager-76b87776c9-mf8vn\" (UID: \"dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.320435 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.338054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jgm\" (UniqueName: \"kubernetes.io/projected/dd73a890-f234-415f-b99a-685059be7d48-kube-api-access-92jgm\") pod \"manila-operator-controller-manager-fbf7bbb96-kmpvr\" (UID: \"dd73a890-f234-415f-b99a-685059be7d48\") " 
pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.338126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntkcw\" (UniqueName: \"kubernetes.io/projected/65722e7d-1557-437c-ae5c-383082933c8c-kube-api-access-ntkcw\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-n4j9l\" (UID: \"65722e7d-1557-437c-ae5c-383082933c8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.338209 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdn8\" (UniqueName: \"kubernetes.io/projected/a518542e-e1c4-4754-9031-d3f1571abb27-kube-api-access-4cdn8\") pod \"neutron-operator-controller-manager-6744dd545c-77v8x\" (UID: \"a518542e-e1c4-4754-9031-d3f1571abb27\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.338255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnxzx\" (UniqueName: \"kubernetes.io/projected/abc215c2-57eb-4c7a-b19d-0ed3ccd67001-kube-api-access-cnxzx\") pod \"nova-operator-controller-manager-bc5c78db9-hpr6x\" (UID: \"abc215c2-57eb-4c7a-b19d-0ed3ccd67001\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.373953 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.376915 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.378243 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.381551 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wcn4c" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.380959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jgm\" (UniqueName: \"kubernetes.io/projected/dd73a890-f234-415f-b99a-685059be7d48-kube-api-access-92jgm\") pod \"manila-operator-controller-manager-fbf7bbb96-kmpvr\" (UID: \"dd73a890-f234-415f-b99a-685059be7d48\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.409531 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntkcw\" (UniqueName: \"kubernetes.io/projected/65722e7d-1557-437c-ae5c-383082933c8c-kube-api-access-ntkcw\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-n4j9l\" (UID: \"65722e7d-1557-437c-ae5c-383082933c8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.435052 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.442947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdn8\" 
(UniqueName: \"kubernetes.io/projected/a518542e-e1c4-4754-9031-d3f1571abb27-kube-api-access-4cdn8\") pod \"neutron-operator-controller-manager-6744dd545c-77v8x\" (UID: \"a518542e-e1c4-4754-9031-d3f1571abb27\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.443119 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnxzx\" (UniqueName: \"kubernetes.io/projected/abc215c2-57eb-4c7a-b19d-0ed3ccd67001-kube-api-access-cnxzx\") pod \"nova-operator-controller-manager-bc5c78db9-hpr6x\" (UID: \"abc215c2-57eb-4c7a-b19d-0ed3ccd67001\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.449875 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.451021 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.453642 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-phtlx" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.474084 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnxzx\" (UniqueName: \"kubernetes.io/projected/abc215c2-57eb-4c7a-b19d-0ed3ccd67001-kube-api-access-cnxzx\") pod \"nova-operator-controller-manager-bc5c78db9-hpr6x\" (UID: \"abc215c2-57eb-4c7a-b19d-0ed3ccd67001\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.476236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.479852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdn8\" (UniqueName: \"kubernetes.io/projected/a518542e-e1c4-4754-9031-d3f1571abb27-kube-api-access-4cdn8\") pod \"neutron-operator-controller-manager-6744dd545c-77v8x\" (UID: \"a518542e-e1c4-4754-9031-d3f1571abb27\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.483628 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.491652 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2scr9" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.491831 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.500016 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.511944 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.521694 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.548404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92glk\" (UniqueName: \"kubernetes.io/projected/05dba0ab-e659-4e0c-8713-4eebeca6edba-kube-api-access-92glk\") pod \"octavia-operator-controller-manager-56f74467c6-5c65h\" (UID: \"05dba0ab-e659-4e0c-8713-4eebeca6edba\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.573681 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.575265 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.583710 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.590805 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tvxqg" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.598123 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.607659 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.622205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.652678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.652894 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dz5w\" (UniqueName: \"kubernetes.io/projected/8cf0ba21-2c05-4e3d-8925-114487cc4998-kube-api-access-2dz5w\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.652926 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.653030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxc5h\" (UniqueName: \"kubernetes.io/projected/a1327184-da65-478d-b7a7-15d0daa3ca95-kube-api-access-fxc5h\") pod \"ovn-operator-controller-manager-846c4cdcb7-l27l2\" (UID: \"a1327184-da65-478d-b7a7-15d0daa3ca95\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.653079 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92glk\" (UniqueName: \"kubernetes.io/projected/05dba0ab-e659-4e0c-8713-4eebeca6edba-kube-api-access-92glk\") pod \"octavia-operator-controller-manager-56f74467c6-5c65h\" (UID: \"05dba0ab-e659-4e0c-8713-4eebeca6edba\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.653596 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.653640 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert podName:155eb4c3-aa63-4ec7-9824-1bef2045a68b nodeName:}" failed. 
No retries permitted until 2026-03-18 15:54:44.653624709 +0000 UTC m=+1233.522953646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert") pod "infra-operator-controller-manager-5595c7d6ff-chpgl" (UID: "155eb4c3-aa63-4ec7-9824-1bef2045a68b") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.655081 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.656507 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.672034 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xrmvv" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.674416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.709772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92glk\" (UniqueName: \"kubernetes.io/projected/05dba0ab-e659-4e0c-8713-4eebeca6edba-kube-api-access-92glk\") pod \"octavia-operator-controller-manager-56f74467c6-5c65h\" (UID: \"05dba0ab-e659-4e0c-8713-4eebeca6edba\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.712229 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.713556 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.718398 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4fbwx" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.719788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.754863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9c94\" (UniqueName: \"kubernetes.io/projected/5e4dd350-9a5b-4626-8b3d-6b9c097b4be1-kube-api-access-j9c94\") pod \"placement-operator-controller-manager-659fb58c6b-fb96d\" (UID: \"5e4dd350-9a5b-4626-8b3d-6b9c097b4be1\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.755193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dz5w\" (UniqueName: \"kubernetes.io/projected/8cf0ba21-2c05-4e3d-8925-114487cc4998-kube-api-access-2dz5w\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.755306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.755458 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxc5h\" (UniqueName: \"kubernetes.io/projected/a1327184-da65-478d-b7a7-15d0daa3ca95-kube-api-access-fxc5h\") pod \"ovn-operator-controller-manager-846c4cdcb7-l27l2\" (UID: \"a1327184-da65-478d-b7a7-15d0daa3ca95\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.758079 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: E0318 15:54:43.758373 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert podName:8cf0ba21-2c05-4e3d-8925-114487cc4998 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:44.258349828 +0000 UTC m=+1233.127678765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" (UID: "8cf0ba21-2c05-4e3d-8925-114487cc4998") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.758952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.760240 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.768238 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x4p45" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.782498 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.787953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dz5w\" (UniqueName: \"kubernetes.io/projected/8cf0ba21-2c05-4e3d-8925-114487cc4998-kube-api-access-2dz5w\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.798393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxc5h\" (UniqueName: \"kubernetes.io/projected/a1327184-da65-478d-b7a7-15d0daa3ca95-kube-api-access-fxc5h\") pod \"ovn-operator-controller-manager-846c4cdcb7-l27l2\" (UID: \"a1327184-da65-478d-b7a7-15d0daa3ca95\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.801134 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.822192 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.823307 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.826139 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8sbhc" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.842096 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.856659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9c94\" (UniqueName: \"kubernetes.io/projected/5e4dd350-9a5b-4626-8b3d-6b9c097b4be1-kube-api-access-j9c94\") pod \"placement-operator-controller-manager-659fb58c6b-fb96d\" (UID: \"5e4dd350-9a5b-4626-8b3d-6b9c097b4be1\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.856706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wl94\" (UniqueName: \"kubernetes.io/projected/5bb05e8f-3780-4bd4-a504-1be6a2887d9f-kube-api-access-2wl94\") pod \"swift-operator-controller-manager-867f54bc44-s4j4w\" (UID: \"5bb05e8f-3780-4bd4-a504-1be6a2887d9f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.856733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzv4n\" (UniqueName: \"kubernetes.io/projected/eb5bab1d-63b4-4ae0-8dfe-734700253a4f-kube-api-access-hzv4n\") pod \"telemetry-operator-controller-manager-5df8f6d8b4-s75wc\" (UID: \"eb5bab1d-63b4-4ae0-8dfe-734700253a4f\") " pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.934747 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9c94\" (UniqueName: \"kubernetes.io/projected/5e4dd350-9a5b-4626-8b3d-6b9c097b4be1-kube-api-access-j9c94\") pod \"placement-operator-controller-manager-659fb58c6b-fb96d\" (UID: \"5e4dd350-9a5b-4626-8b3d-6b9c097b4be1\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.949710 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.958939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl94\" (UniqueName: \"kubernetes.io/projected/5bb05e8f-3780-4bd4-a504-1be6a2887d9f-kube-api-access-2wl94\") pod \"swift-operator-controller-manager-867f54bc44-s4j4w\" (UID: \"5bb05e8f-3780-4bd4-a504-1be6a2887d9f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.959001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzv4n\" (UniqueName: \"kubernetes.io/projected/eb5bab1d-63b4-4ae0-8dfe-734700253a4f-kube-api-access-hzv4n\") pod \"telemetry-operator-controller-manager-5df8f6d8b4-s75wc\" (UID: \"eb5bab1d-63b4-4ae0-8dfe-734700253a4f\") " pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.959041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9b86\" (UniqueName: \"kubernetes.io/projected/fbcfdc60-25a6-41e2-8dc1-eb9093393808-kube-api-access-k9b86\") pod \"test-operator-controller-manager-8467ccb4c8-b7zpr\" (UID: \"fbcfdc60-25a6-41e2-8dc1-eb9093393808\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:54:43 
crc kubenswrapper[4792]: I0318 15:54:43.959126 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rv7\" (UniqueName: \"kubernetes.io/projected/96809e41-8656-4095-a2f9-9d69c31efe61-kube-api-access-j2rv7\") pod \"watcher-operator-controller-manager-74d6f7b5c-z4nr9\" (UID: \"96809e41-8656-4095-a2f9-9d69c31efe61\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.967849 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.974320 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.978038 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tjpxb" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.978097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.979763 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wl94\" (UniqueName: \"kubernetes.io/projected/5bb05e8f-3780-4bd4-a504-1be6a2887d9f-kube-api-access-2wl94\") pod \"swift-operator-controller-manager-867f54bc44-s4j4w\" (UID: \"5bb05e8f-3780-4bd4-a504-1be6a2887d9f\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.981528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzv4n\" (UniqueName: \"kubernetes.io/projected/eb5bab1d-63b4-4ae0-8dfe-734700253a4f-kube-api-access-hzv4n\") pod 
\"telemetry-operator-controller-manager-5df8f6d8b4-s75wc\" (UID: \"eb5bab1d-63b4-4ae0-8dfe-734700253a4f\") " pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.981566 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.983802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"] Mar 18 15:54:43 crc kubenswrapper[4792]: I0318 15:54:43.992563 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.011013 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.022636 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.062730 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.064110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9b86\" (UniqueName: \"kubernetes.io/projected/fbcfdc60-25a6-41e2-8dc1-eb9093393808-kube-api-access-k9b86\") pod \"test-operator-controller-manager-8467ccb4c8-b7zpr\" (UID: \"fbcfdc60-25a6-41e2-8dc1-eb9093393808\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.064163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zrr\" (UniqueName: \"kubernetes.io/projected/14667803-000a-4186-8eb1-da78ce4812a0-kube-api-access-x7zrr\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.064229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.064265 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rv7\" (UniqueName: \"kubernetes.io/projected/96809e41-8656-4095-a2f9-9d69c31efe61-kube-api-access-j2rv7\") pod \"watcher-operator-controller-manager-74d6f7b5c-z4nr9\" (UID: \"96809e41-8656-4095-a2f9-9d69c31efe61\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:54:44 crc 
kubenswrapper[4792]: I0318 15:54:44.064340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.064785 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.065999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.069755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g4lvs" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.077673 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.099750 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9b86\" (UniqueName: \"kubernetes.io/projected/fbcfdc60-25a6-41e2-8dc1-eb9093393808-kube-api-access-k9b86\") pod \"test-operator-controller-manager-8467ccb4c8-b7zpr\" (UID: \"fbcfdc60-25a6-41e2-8dc1-eb9093393808\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.098635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rv7\" (UniqueName: \"kubernetes.io/projected/96809e41-8656-4095-a2f9-9d69c31efe61-kube-api-access-j2rv7\") pod 
\"watcher-operator-controller-manager-74d6f7b5c-z4nr9\" (UID: \"96809e41-8656-4095-a2f9-9d69c31efe61\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.108554 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.144465 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.149940 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.165693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.165802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.165856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zrr\" (UniqueName: \"kubernetes.io/projected/14667803-000a-4186-8eb1-da78ce4812a0-kube-api-access-x7zrr\") pod 
\"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.166813 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.166851 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:44.666838852 +0000 UTC m=+1233.536167789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "metrics-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.167005 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.167047 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:44.667020148 +0000 UTC m=+1233.536349075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.200588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zrr\" (UniqueName: \"kubernetes.io/projected/14667803-000a-4186-8eb1-da78ce4812a0-kube-api-access-x7zrr\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.267837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.267944 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgz5\" (UniqueName: \"kubernetes.io/projected/f172d1b7-2345-4bf5-ba2e-c142f4f8c482-kube-api-access-ckgz5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b44rd\" (UID: \"f172d1b7-2345-4bf5-ba2e-c142f4f8c482\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.268214 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.268273 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert podName:8cf0ba21-2c05-4e3d-8925-114487cc4998 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:45.268254886 +0000 UTC m=+1234.137583823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" (UID: "8cf0ba21-2c05-4e3d-8925-114487cc4998") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.369485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckgz5\" (UniqueName: \"kubernetes.io/projected/f172d1b7-2345-4bf5-ba2e-c142f4f8c482-kube-api-access-ckgz5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b44rd\" (UID: \"f172d1b7-2345-4bf5-ba2e-c142f4f8c482\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.372994 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.393857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgz5\" (UniqueName: \"kubernetes.io/projected/f172d1b7-2345-4bf5-ba2e-c142f4f8c482-kube-api-access-ckgz5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b44rd\" (UID: \"f172d1b7-2345-4bf5-ba2e-c142f4f8c482\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.398641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.424478 4792 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6"] Mar 18 15:54:44 crc kubenswrapper[4792]: W0318 15:54:44.463381 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d5f156_656e_4e2f_b368_e841124084d1.slice/crio-bd4187a73aeb3fae5a16240c08d07da6c8b47fc5a804abe3388d4e1924cae70b WatchSource:0}: Error finding container bd4187a73aeb3fae5a16240c08d07da6c8b47fc5a804abe3388d4e1924cae70b: Status 404 returned error can't find the container with id bd4187a73aeb3fae5a16240c08d07da6c8b47fc5a804abe3388d4e1924cae70b Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.523029 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m"] Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.558458 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.674537 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.674609 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.674663 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674794 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674807 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674822 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674852 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:45.674831214 +0000 UTC m=+1234.544160171 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674895 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:45.674871395 +0000 UTC m=+1234.544200372 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "metrics-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: E0318 15:54:44.674919 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert podName:155eb4c3-aa63-4ec7-9824-1bef2045a68b nodeName:}" failed. No retries permitted until 2026-03-18 15:54:46.674909216 +0000 UTC m=+1235.544238223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert") pod "infra-operator-controller-manager-5595c7d6ff-chpgl" (UID: "155eb4c3-aa63-4ec7-9824-1bef2045a68b") : secret "infra-operator-webhook-server-cert" not found Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.694250 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw"] Mar 18 15:54:44 crc kubenswrapper[4792]: W0318 15:54:44.696216 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3692a84a_23dc_4b6c_9c20_d97bd0e285d8.slice/crio-14937a01761de58012a04a06a6f6ec2fca620eb7f792780062458edcb704e386 WatchSource:0}: Error finding container 14937a01761de58012a04a06a6f6ec2fca620eb7f792780062458edcb704e386: Status 404 returned error can't find the container with id 14937a01761de58012a04a06a6f6ec2fca620eb7f792780062458edcb704e386 Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.952524 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" 
event={"ID":"aa2e6c5a-c94a-482a-aceb-156b1cc316d0","Type":"ContainerStarted","Data":"3bdcbabd4a7893de1b6840a913168779351d24600d24defb2588a1b594c46bc5"} Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.955701 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" event={"ID":"af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9","Type":"ContainerStarted","Data":"954b7c63afd27f6c97841ce4fd02280743ecd0984f1f8b4c2ec35277ab47bc9d"} Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.960653 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" event={"ID":"3692a84a-23dc-4b6c-9c20-d97bd0e285d8","Type":"ContainerStarted","Data":"14937a01761de58012a04a06a6f6ec2fca620eb7f792780062458edcb704e386"} Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.964324 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" event={"ID":"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512","Type":"ContainerStarted","Data":"81637e6b98ab4552eede993a8bbe40eca67c553b2239344bd991f5667ffcdc15"} Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.965801 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" event={"ID":"55d5f156-656e-4e2f-b368-e841124084d1","Type":"ContainerStarted","Data":"bd4187a73aeb3fae5a16240c08d07da6c8b47fc5a804abe3388d4e1924cae70b"} Mar 18 15:54:44 crc kubenswrapper[4792]: I0318 15:54:44.985577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" event={"ID":"6ccc988b-8909-4e90-b016-c94a1deb2de7","Type":"ContainerStarted","Data":"639ae6e3e63ed02fd1caaed1219f3926099e160d73fd5e3fb3b8af2c8c00f57a"} Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.291531 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.291721 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.291775 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert podName:8cf0ba21-2c05-4e3d-8925-114487cc4998 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:47.291758039 +0000 UTC m=+1236.161086976 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" (UID: "8cf0ba21-2c05-4e3d-8925-114487cc4998") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.292119 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.356062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn"] Mar 18 15:54:45 crc kubenswrapper[4792]: W0318 15:54:45.365846 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbab4be0_7cc0_4e9c_8f84_5c9fd04d4e3b.slice/crio-7eca792ded8e352adbaa51ba3d2319027a8d11d240225401111c9005f5310ac0 WatchSource:0}: 
Error finding container 7eca792ded8e352adbaa51ba3d2319027a8d11d240225401111c9005f5310ac0: Status 404 returned error can't find the container with id 7eca792ded8e352adbaa51ba3d2319027a8d11d240225401111c9005f5310ac0 Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.382016 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.431124 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.470824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.519760 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.561472 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x"] Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.724828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.724959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") 
" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.725141 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.725191 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:47.725174367 +0000 UTC m=+1236.594503304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "metrics-server-cert" not found
Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.725519 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 15:54:45 crc kubenswrapper[4792]: E0318 15:54:45.725542 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:47.725534937 +0000 UTC m=+1236.594863874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "webhook-server-cert" not found
Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.775459 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc"]
Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.795926 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr"]
Mar 18 15:54:45 crc kubenswrapper[4792]: W0318 15:54:45.799602 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb5bab1d_63b4_4ae0_8dfe_734700253a4f.slice/crio-4278ef1e7d7c459915a4f34f108c70b281d653f4e2e7921cdc2075db842bae58 WatchSource:0}: Error finding container 4278ef1e7d7c459915a4f34f108c70b281d653f4e2e7921cdc2075db842bae58: Status 404 returned error can't find the container with id 4278ef1e7d7c459915a4f34f108c70b281d653f4e2e7921cdc2075db842bae58
Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.812051 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h"]
Mar 18 15:54:45 crc kubenswrapper[4792]: W0318 15:54:45.822605 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bb05e8f_3780_4bd4_a504_1be6a2887d9f.slice/crio-3853575ca7091fb9ed1aa1961c336c5b536f3a5e871bebe61029bbdb1a7e3b08 WatchSource:0}: Error finding container 3853575ca7091fb9ed1aa1961c336c5b536f3a5e871bebe61029bbdb1a7e3b08: Status 404 returned error can't find the container with id 3853575ca7091fb9ed1aa1961c336c5b536f3a5e871bebe61029bbdb1a7e3b08
Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.828924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w"]
Mar 18 15:54:45 crc kubenswrapper[4792]: I0318 15:54:45.841738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d"]
Mar 18 15:54:45 crc kubenswrapper[4792]: W0318 15:54:45.847814 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4dd350_9a5b_4626_8b3d_6b9c097b4be1.slice/crio-3ff71ec34026a3c1646fd2df6c955efb2654ea858cd870ca5ec28f578e9124dc WatchSource:0}: Error finding container 3ff71ec34026a3c1646fd2df6c955efb2654ea858cd870ca5ec28f578e9124dc: Status 404 returned error can't find the container with id 3ff71ec34026a3c1646fd2df6c955efb2654ea858cd870ca5ec28f578e9124dc
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:45.998230 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" event={"ID":"abc215c2-57eb-4c7a-b19d-0ed3ccd67001","Type":"ContainerStarted","Data":"5e22b0ea0a87310fef734de665e6ee59219cf15b2a32de65b3671f16d9fd9706"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.001556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" event={"ID":"05dba0ab-e659-4e0c-8713-4eebeca6edba","Type":"ContainerStarted","Data":"a3c4ffaf5137c868849cb1cb33591eb8295a07c1de9b8c4739b75f34afdf4d2f"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.005027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" event={"ID":"fbcfdc60-25a6-41e2-8dc1-eb9093393808","Type":"ContainerStarted","Data":"b33cf4b967d7133a717f067cdbb59b549db7bf02b8d96d113062040d25e32fa0"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.013443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" event={"ID":"5bb05e8f-3780-4bd4-a504-1be6a2887d9f","Type":"ContainerStarted","Data":"3853575ca7091fb9ed1aa1961c336c5b536f3a5e871bebe61029bbdb1a7e3b08"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.014605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" event={"ID":"eb5bab1d-63b4-4ae0-8dfe-734700253a4f","Type":"ContainerStarted","Data":"4278ef1e7d7c459915a4f34f108c70b281d653f4e2e7921cdc2075db842bae58"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.015398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" event={"ID":"5e4dd350-9a5b-4626-8b3d-6b9c097b4be1","Type":"ContainerStarted","Data":"3ff71ec34026a3c1646fd2df6c955efb2654ea858cd870ca5ec28f578e9124dc"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.016564 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" event={"ID":"a518542e-e1c4-4754-9031-d3f1571abb27","Type":"ContainerStarted","Data":"2d1382f27affa5a696da79eef7a4ad30c8b74f9cd2f25e75565a5098cc1f7706"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.017937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" event={"ID":"a1327184-da65-478d-b7a7-15d0daa3ca95","Type":"ContainerStarted","Data":"f13becccd5e21f01f64f835acde5e6cdd7b8c92c080387914b556db96a277611"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.021572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" event={"ID":"dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b","Type":"ContainerStarted","Data":"7eca792ded8e352adbaa51ba3d2319027a8d11d240225401111c9005f5310ac0"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.022250 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9"]
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.024516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" event={"ID":"dd73a890-f234-415f-b99a-685059be7d48","Type":"ContainerStarted","Data":"59c1c857b85c61543a5eea75660c386827e25055796e6eb45ad922cf08936e76"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.026813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" event={"ID":"65722e7d-1557-437c-ae5c-383082933c8c","Type":"ContainerStarted","Data":"7fcbbcae2466d0721c77d92f24312e2242858a913b19aac278e95176e11fcd32"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.029730 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" event={"ID":"79896742-17fd-4960-ae5b-af3c83550a4e","Type":"ContainerStarted","Data":"b22093606732518f4dc0fbabb0d3d7d36c373ec3d951020f7c1b2a66c9a84965"}
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.050360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd"]
Mar 18 15:54:46 crc kubenswrapper[4792]: E0318 15:54:46.122689 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ckgz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-b44rd_openstack-operators(f172d1b7-2345-4bf5-ba2e-c142f4f8c482): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 18 15:54:46 crc kubenswrapper[4792]: E0318 15:54:46.125283 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" podUID="f172d1b7-2345-4bf5-ba2e-c142f4f8c482"
Mar 18 15:54:46 crc kubenswrapper[4792]: I0318 15:54:46.749162 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"
Mar 18 15:54:46 crc kubenswrapper[4792]: E0318 15:54:46.749285 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 15:54:46 crc kubenswrapper[4792]: E0318 15:54:46.749556 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert podName:155eb4c3-aa63-4ec7-9824-1bef2045a68b nodeName:}" failed. No retries permitted until 2026-03-18 15:54:50.749537055 +0000 UTC m=+1239.618865992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert") pod "infra-operator-controller-manager-5595c7d6ff-chpgl" (UID: "155eb4c3-aa63-4ec7-9824-1bef2045a68b") : secret "infra-operator-webhook-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: I0318 15:54:47.057714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" event={"ID":"96809e41-8656-4095-a2f9-9d69c31efe61","Type":"ContainerStarted","Data":"a31a2b44737219ff46fa8c7355d28ec5b6efd7f430a3458631c34ddab7fcd76e"}
Mar 18 15:54:47 crc kubenswrapper[4792]: I0318 15:54:47.080010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" event={"ID":"f172d1b7-2345-4bf5-ba2e-c142f4f8c482","Type":"ContainerStarted","Data":"58b8369218cb7857aa0dd50f24085a453b03e00036c7c616b9fc87a0f2c5db67"}
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.082770 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" podUID="f172d1b7-2345-4bf5-ba2e-c142f4f8c482"
Mar 18 15:54:47 crc kubenswrapper[4792]: I0318 15:54:47.368124 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.374302 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.374413 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert podName:8cf0ba21-2c05-4e3d-8925-114487cc4998 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:51.374382103 +0000 UTC m=+1240.243711040 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" (UID: "8cf0ba21-2c05-4e3d-8925-114487cc4998") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: I0318 15:54:47.782520 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:47 crc kubenswrapper[4792]: I0318 15:54:47.782689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.782685 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.782776 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:51.782750464 +0000 UTC m=+1240.652079471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "webhook-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.782954 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 15:54:47 crc kubenswrapper[4792]: E0318 15:54:47.783058 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:51.783043703 +0000 UTC m=+1240.652372640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "metrics-server-cert" not found
Mar 18 15:54:48 crc kubenswrapper[4792]: E0318 15:54:48.141860 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" podUID="f172d1b7-2345-4bf5-ba2e-c142f4f8c482"
Mar 18 15:54:50 crc kubenswrapper[4792]: I0318 15:54:50.848055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"
Mar 18 15:54:50 crc kubenswrapper[4792]: E0318 15:54:50.848210 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 15:54:50 crc kubenswrapper[4792]: E0318 15:54:50.848724 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert podName:155eb4c3-aa63-4ec7-9824-1bef2045a68b nodeName:}" failed. No retries permitted until 2026-03-18 15:54:58.848701769 +0000 UTC m=+1247.718030706 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert") pod "infra-operator-controller-manager-5595c7d6ff-chpgl" (UID: "155eb4c3-aa63-4ec7-9824-1bef2045a68b") : secret "infra-operator-webhook-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: I0318 15:54:51.459055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.459269 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.459452 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert podName:8cf0ba21-2c05-4e3d-8925-114487cc4998 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:59.459429431 +0000 UTC m=+1248.328758368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" (UID: "8cf0ba21-2c05-4e3d-8925-114487cc4998") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: I0318 15:54:51.866330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:51 crc kubenswrapper[4792]: I0318 15:54:51.866491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.866653 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.866710 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:59.86669137 +0000 UTC m=+1248.736020307 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "metrics-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.867114 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 15:54:51 crc kubenswrapper[4792]: E0318 15:54:51.867146 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs podName:14667803-000a-4186-8eb1-da78ce4812a0 nodeName:}" failed. No retries permitted until 2026-03-18 15:54:59.867137583 +0000 UTC m=+1248.736466520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs") pod "openstack-operator-controller-manager-c79f466d7-95zwp" (UID: "14667803-000a-4186-8eb1-da78ce4812a0") : secret "webhook-server-cert" not found
Mar 18 15:54:58 crc kubenswrapper[4792]: I0318 15:54:58.896571 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"
Mar 18 15:54:58 crc kubenswrapper[4792]: I0318 15:54:58.903759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/155eb4c3-aa63-4ec7-9824-1bef2045a68b-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-chpgl\" (UID: \"155eb4c3-aa63-4ec7-9824-1bef2045a68b\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.099429 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.507757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.513156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cf0ba21-2c05-4e3d-8925-114487cc4998-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp\" (UID: \"8cf0ba21-2c05-4e3d-8925-114487cc4998\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.750291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.914727 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.914850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.917864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-metrics-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:54:59 crc kubenswrapper[4792]: I0318 15:54:59.918351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14667803-000a-4186-8eb1-da78ce4812a0-webhook-certs\") pod \"openstack-operator-controller-manager-c79f466d7-95zwp\" (UID: \"14667803-000a-4186-8eb1-da78ce4812a0\") " pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:55:00 crc kubenswrapper[4792]: I0318 15:55:00.130839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.152136 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.152381 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqwg7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6cc65c69fc-tls5q_openstack-operators(6ccc988b-8909-4e90-b016-c94a1deb2de7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.153684 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" podUID="6ccc988b-8909-4e90-b016-c94a1deb2de7"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.281583 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:e55a2b7d1ddf3e48be378bf8d3e844969e957a478171badd8c7d69f9e89bfb2e\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" podUID="6ccc988b-8909-4e90-b016-c94a1deb2de7"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.725505 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.725994 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-brnzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-76b87776c9-mf8vn_openstack-operators(dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 15:55:00 crc kubenswrapper[4792]: E0318 15:55:00.727192 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" podUID="dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b"
Mar 18 15:55:01 crc kubenswrapper[4792]: E0318 15:55:01.240165 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702"
Mar 18 15:55:01 crc kubenswrapper[4792]: E0318 15:55:01.240342 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxc5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-846c4cdcb7-l27l2_openstack-operators(a1327184-da65-478d-b7a7-15d0daa3ca95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 15:55:01 crc kubenswrapper[4792]: E0318 15:55:01.242033 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" podUID="a1327184-da65-478d-b7a7-15d0daa3ca95"
Mar 18 15:55:01 crc kubenswrapper[4792]: E0318 15:55:01.294939 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" podUID="a1327184-da65-478d-b7a7-15d0daa3ca95"
Mar 18 15:55:01 crc kubenswrapper[4792]: E0318 15:55:01.294992 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" podUID="dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.332179 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.332829 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmtgd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-66dd9d474d-nvs5w_openstack-operators(af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.334109 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.983828 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.984028 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4cdn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6744dd545c-77v8x_openstack-operators(a518542e-e1c4-4754-9031-d3f1571abb27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:03 crc kubenswrapper[4792]: E0318 15:55:03.985280 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podUID="a518542e-e1c4-4754-9031-d3f1571abb27" Mar 18 15:55:04 crc kubenswrapper[4792]: E0318 15:55:04.325168 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1\\\"\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" Mar 18 15:55:04 crc kubenswrapper[4792]: E0318 15:55:04.325580 4792 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podUID="a518542e-e1c4-4754-9031-d3f1571abb27" Mar 18 15:55:05 crc kubenswrapper[4792]: E0318 15:55:05.217470 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396" Mar 18 15:55:05 crc kubenswrapper[4792]: E0318 15:55:05.217907 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vqz2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6d77645966-xzpbw_openstack-operators(3692a84a-23dc-4b6c-9c20-d97bd0e285d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:05 crc kubenswrapper[4792]: E0318 15:55:05.219653 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" podUID="3692a84a-23dc-4b6c-9c20-d97bd0e285d8" Mar 18 15:55:05 crc kubenswrapper[4792]: E0318 15:55:05.329423 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" podUID="3692a84a-23dc-4b6c-9c20-d97bd0e285d8" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.074112 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.074564 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k9b86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-b7zpr_openstack-operators(fbcfdc60-25a6-41e2-8dc1-eb9093393808): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.075751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" podUID="fbcfdc60-25a6-41e2-8dc1-eb9093393808" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.343627 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" podUID="fbcfdc60-25a6-41e2-8dc1-eb9093393808" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.610638 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.610811 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-92jgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-fbf7bbb96-kmpvr_openstack-operators(dd73a890-f234-415f-b99a-685059be7d48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:07 crc kubenswrapper[4792]: E0318 15:55:07.612018 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" podUID="dd73a890-f234-415f-b99a-685059be7d48" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.141615 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.141842 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-92glk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-56f74467c6-5c65h_openstack-operators(05dba0ab-e659-4e0c-8713-4eebeca6edba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.143059 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podUID="05dba0ab-e659-4e0c-8713-4eebeca6edba" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.353420 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" podUID="dd73a890-f234-415f-b99a-685059be7d48" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.354139 4792 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b0ba0389a96140174eaad4ad8cc3e98118472d640bdca18046877e973f009ff4\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podUID="05dba0ab-e659-4e0c-8713-4eebeca6edba" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.630055 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.630225 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ntkcw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6f5b7bcd4-n4j9l_openstack-operators(65722e7d-1557-437c-ae5c-383082933c8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:08 crc kubenswrapper[4792]: E0318 15:55:08.631724 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" podUID="65722e7d-1557-437c-ae5c-383082933c8c" Mar 18 15:55:09 crc kubenswrapper[4792]: E0318 15:55:09.357376 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:4f487da837018bfcd11dd794ba8f4dacc839b92e0d060c146fd1f771d750abf8\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" podUID="65722e7d-1557-437c-ae5c-383082933c8c" Mar 18 15:55:09 crc kubenswrapper[4792]: E0318 15:55:09.858159 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 18 15:55:09 crc kubenswrapper[4792]: E0318 15:55:09.858366 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnxzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-hpr6x_openstack-operators(abc215c2-57eb-4c7a-b19d-0ed3ccd67001): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:09 crc kubenswrapper[4792]: E0318 15:55:09.859644 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" podUID="abc215c2-57eb-4c7a-b19d-0ed3ccd67001" Mar 18 15:55:10 crc kubenswrapper[4792]: E0318 15:55:10.364332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" podUID="abc215c2-57eb-4c7a-b19d-0ed3ccd67001" Mar 18 15:55:10 crc kubenswrapper[4792]: E0318 15:55:10.372423 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a" Mar 18 15:55:10 crc kubenswrapper[4792]: E0318 15:55:10.372792 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9c94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-659fb58c6b-fb96d_openstack-operators(5e4dd350-9a5b-4626-8b3d-6b9c097b4be1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:10 crc kubenswrapper[4792]: E0318 15:55:10.376427 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podUID="5e4dd350-9a5b-4626-8b3d-6b9c097b4be1" Mar 18 15:55:11 crc kubenswrapper[4792]: E0318 15:55:11.104232 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953" Mar 18 15:55:11 crc kubenswrapper[4792]: E0318 15:55:11.104740 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2rv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-74d6f7b5c-z4nr9_openstack-operators(96809e41-8656-4095-a2f9-9d69c31efe61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:11 crc kubenswrapper[4792]: E0318 15:55:11.106372 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" podUID="96809e41-8656-4095-a2f9-9d69c31efe61" Mar 18 15:55:11 crc kubenswrapper[4792]: E0318 15:55:11.371961 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podUID="5e4dd350-9a5b-4626-8b3d-6b9c097b4be1" Mar 18 15:55:11 crc kubenswrapper[4792]: E0318 15:55:11.372793 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" podUID="96809e41-8656-4095-a2f9-9d69c31efe61" Mar 18 15:55:12 crc kubenswrapper[4792]: E0318 15:55:12.246006 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267" Mar 18 15:55:12 crc kubenswrapper[4792]: E0318 15:55:12.246067 4792 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267" Mar 18 15:55:12 crc kubenswrapper[4792]: E0318 15:55:12.246188 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzv4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5df8f6d8b4-s75wc_openstack-operators(eb5bab1d-63b4-4ae0-8dfe-734700253a4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:55:12 crc kubenswrapper[4792]: E0318 15:55:12.247477 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" podUID="eb5bab1d-63b4-4ae0-8dfe-734700253a4f" Mar 18 15:55:12 crc kubenswrapper[4792]: E0318 15:55:12.383746 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" podUID="eb5bab1d-63b4-4ae0-8dfe-734700253a4f" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.241519 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl"] Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.253421 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp"] Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.396084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" event={"ID":"55d5f156-656e-4e2f-b368-e841124084d1","Type":"ContainerStarted","Data":"0194a74c5ba11d6621ae56450a8bb398a5a240f750e610a890b4f6f57d651b88"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.397180 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.399720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" event={"ID":"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512","Type":"ContainerStarted","Data":"65eba7399a3f8c7ae46488ad152addb6c5e00cabe6dd5f8816bc3ae525da88aa"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.399921 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.406912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" event={"ID":"aa2e6c5a-c94a-482a-aceb-156b1cc316d0","Type":"ContainerStarted","Data":"cd1639adb94dede806c79533a3b90bd2adf1d1a5fd9c960a9c7bbe7f5da26356"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.407254 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.411376 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp"] Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.420992 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" podStartSLOduration=3.681477692 podStartE2EDuration="31.420963161s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:44.4907579 +0000 UTC m=+1233.360086837" lastFinishedPulling="2026-03-18 15:55:12.230243369 +0000 UTC m=+1261.099572306" observedRunningTime="2026-03-18 15:55:13.415458649 +0000 UTC m=+1262.284787606" watchObservedRunningTime="2026-03-18 15:55:13.420963161 +0000 UTC m=+1262.290292098" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.423586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" event={"ID":"79896742-17fd-4960-ae5b-af3c83550a4e","Type":"ContainerStarted","Data":"6051853786fac981d70c57714eb764df419ccb07a9c57421ccaf4a2744712ebb"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.423814 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.425286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" event={"ID":"f172d1b7-2345-4bf5-ba2e-c142f4f8c482","Type":"ContainerStarted","Data":"22f4ffc57c213e89496b49ae7aa694fba387e43ef9f66d8b2ccedc9b2f56200f"} Mar 18 15:55:16 crc kubenswrapper[4792]: W0318 15:55:13.428023 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf0ba21_2c05_4e3d_8925_114487cc4998.slice/crio-32d21bd67722090aebb02a2468041349c8aacf0399f1c67fa4a840393e8d3b4d WatchSource:0}: Error finding container 32d21bd67722090aebb02a2468041349c8aacf0399f1c67fa4a840393e8d3b4d: Status 404 returned error can't find the container with id 32d21bd67722090aebb02a2468041349c8aacf0399f1c67fa4a840393e8d3b4d Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.453093 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" event={"ID":"6ccc988b-8909-4e90-b016-c94a1deb2de7","Type":"ContainerStarted","Data":"5c1ed0778cf39cb1e34f5efee717de200fbd24c28b474cd67f0a834df1b92f4c"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.453292 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.456052 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" podStartSLOduration=3.75112157 podStartE2EDuration="31.456027082s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:44.529269043 +0000 UTC m=+1233.398597980" 
lastFinishedPulling="2026-03-18 15:55:12.234174555 +0000 UTC m=+1261.103503492" observedRunningTime="2026-03-18 15:55:13.453197718 +0000 UTC m=+1262.322526655" watchObservedRunningTime="2026-03-18 15:55:13.456027082 +0000 UTC m=+1262.325356019" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.484587 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" podStartSLOduration=4.328481311 podStartE2EDuration="31.484557541s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:43.967567482 +0000 UTC m=+1232.836896419" lastFinishedPulling="2026-03-18 15:55:11.123643712 +0000 UTC m=+1259.992972649" observedRunningTime="2026-03-18 15:55:13.470786106 +0000 UTC m=+1262.340115043" watchObservedRunningTime="2026-03-18 15:55:13.484557541 +0000 UTC m=+1262.353886478" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.500314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" event={"ID":"155eb4c3-aa63-4ec7-9824-1bef2045a68b","Type":"ContainerStarted","Data":"98cf81e452f1293b08df82ddfadb57fb7e6b44ea95c3160c87e847b299bbcdf5"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.512302 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b44rd" podStartSLOduration=3.730618169 podStartE2EDuration="30.512283416s" podCreationTimestamp="2026-03-18 15:54:43 +0000 UTC" firstStartedPulling="2026-03-18 15:54:46.122475372 +0000 UTC m=+1234.991804309" lastFinishedPulling="2026-03-18 15:55:12.904140619 +0000 UTC m=+1261.773469556" observedRunningTime="2026-03-18 15:55:13.500379166 +0000 UTC m=+1262.369708103" watchObservedRunningTime="2026-03-18 15:55:13.512283416 +0000 UTC m=+1262.381612353" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.525553 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" podStartSLOduration=4.739496959 podStartE2EDuration="31.525513925s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.448155489 +0000 UTC m=+1234.317484426" lastFinishedPulling="2026-03-18 15:55:12.234172455 +0000 UTC m=+1261.103501392" observedRunningTime="2026-03-18 15:55:13.519776467 +0000 UTC m=+1262.389105404" watchObservedRunningTime="2026-03-18 15:55:13.525513925 +0000 UTC m=+1262.394842862" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.527409 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" event={"ID":"5bb05e8f-3780-4bd4-a504-1be6a2887d9f","Type":"ContainerStarted","Data":"f09b929d071c706148d64c0f7da1a5c2937ef1f3d62eeccc145e2cb68c8be07f"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.527661 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.533925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" event={"ID":"14667803-000a-4186-8eb1-da78ce4812a0","Type":"ContainerStarted","Data":"eefd3d8926b1db5f30284d6e8dcec6c25d11ad186c3e018020d49bea24046a7c"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.545949 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" podStartSLOduration=3.083758471 podStartE2EDuration="31.545930076s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:44.434896807 +0000 UTC m=+1233.304225744" lastFinishedPulling="2026-03-18 15:55:12.897068422 +0000 UTC m=+1261.766397349" 
observedRunningTime="2026-03-18 15:55:13.539592399 +0000 UTC m=+1262.408921356" watchObservedRunningTime="2026-03-18 15:55:13.545930076 +0000 UTC m=+1262.415259013" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:13.566794 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" podStartSLOduration=5.169779485 podStartE2EDuration="31.566775269s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.833281386 +0000 UTC m=+1234.702610323" lastFinishedPulling="2026-03-18 15:55:12.23027717 +0000 UTC m=+1261.099606107" observedRunningTime="2026-03-18 15:55:13.565643676 +0000 UTC m=+1262.434972613" watchObservedRunningTime="2026-03-18 15:55:13.566775269 +0000 UTC m=+1262.436104206" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:14.554850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" event={"ID":"14667803-000a-4186-8eb1-da78ce4812a0","Type":"ContainerStarted","Data":"e34e2b69c564a9434b089fa48e66efc96e8a61cdf3a2d2f004c36f2a237544df"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:14.556126 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:14.557615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" event={"ID":"8cf0ba21-2c05-4e3d-8925-114487cc4998","Type":"ContainerStarted","Data":"32d21bd67722090aebb02a2468041349c8aacf0399f1c67fa4a840393e8d3b4d"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:14.559582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" 
event={"ID":"dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b","Type":"ContainerStarted","Data":"bdfabdf22d16abc73a1c5eadbc585866f5d16f8ef3f9c63fb455e62687390c31"} Mar 18 15:55:16 crc kubenswrapper[4792]: I0318 15:55:14.597586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" podStartSLOduration=31.597567907 podStartE2EDuration="31.597567907s" podCreationTimestamp="2026-03-18 15:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:55:14.593429775 +0000 UTC m=+1263.462758732" watchObservedRunningTime="2026-03-18 15:55:14.597567907 +0000 UTC m=+1263.466896844" Mar 18 15:55:17 crc kubenswrapper[4792]: I0318 15:55:17.605052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" event={"ID":"a1327184-da65-478d-b7a7-15d0daa3ca95","Type":"ContainerStarted","Data":"c78471694c57d26f8bccc0400a445bfaaafeec953d496ef223980dd4b7b7a455"} Mar 18 15:55:17 crc kubenswrapper[4792]: I0318 15:55:17.606329 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:55:17 crc kubenswrapper[4792]: I0318 15:55:17.606912 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:55:17 crc kubenswrapper[4792]: I0318 15:55:17.633331 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" podStartSLOduration=7.621262777 podStartE2EDuration="35.633304123s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.43935101 +0000 UTC m=+1234.308679947" lastFinishedPulling="2026-03-18 15:55:13.451392356 +0000 UTC 
m=+1262.320721293" observedRunningTime="2026-03-18 15:55:17.62845436 +0000 UTC m=+1266.497783297" watchObservedRunningTime="2026-03-18 15:55:17.633304123 +0000 UTC m=+1266.502633060" Mar 18 15:55:17 crc kubenswrapper[4792]: I0318 15:55:17.673588 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" podStartSLOduration=4.246128709 podStartE2EDuration="35.673568838s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.439756692 +0000 UTC m=+1234.309085639" lastFinishedPulling="2026-03-18 15:55:16.867196831 +0000 UTC m=+1265.736525768" observedRunningTime="2026-03-18 15:55:17.669498817 +0000 UTC m=+1266.538827774" watchObservedRunningTime="2026-03-18 15:55:17.673568838 +0000 UTC m=+1266.542897775" Mar 18 15:55:18 crc kubenswrapper[4792]: I0318 15:55:18.616067 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" Mar 18 15:55:20 crc kubenswrapper[4792]: I0318 15:55:20.138265 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.642361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" event={"ID":"af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9","Type":"ContainerStarted","Data":"74a8ebb89bb056b66aa7f89f586396df44b1214000d5dfce398e3a36854929ac"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.644094 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.646037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" event={"ID":"fbcfdc60-25a6-41e2-8dc1-eb9093393808","Type":"ContainerStarted","Data":"30bd6beaef6a77d24b34b0a75c102d53bf848328f730ff39fcfd76d7baf6ed05"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.646905 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.648129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" event={"ID":"a518542e-e1c4-4754-9031-d3f1571abb27","Type":"ContainerStarted","Data":"c69eb81e8463083d08881edc2dc54f799868828e2a8cb577543fff70398f39d9"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.648421 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.651255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" event={"ID":"05dba0ab-e659-4e0c-8713-4eebeca6edba","Type":"ContainerStarted","Data":"5427ca2d12b9cec1ba004075d94d14c0c5b70c62c051f4dfefc1235af6d79492"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.652191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.654902 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" event={"ID":"8cf0ba21-2c05-4e3d-8925-114487cc4998","Type":"ContainerStarted","Data":"1ad34a75b919ab47351fb5a8af42351a451242dbb82605331b9aeb32d8ed9722"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.655972 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.658820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" event={"ID":"3692a84a-23dc-4b6c-9c20-d97bd0e285d8","Type":"ContainerStarted","Data":"4421da450b60d9f0a6b0d0289a8fa1551c3bcfb4b69fbe9247d062f7b0bd4b2d"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.659172 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.660536 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" event={"ID":"dd73a890-f234-415f-b99a-685059be7d48","Type":"ContainerStarted","Data":"e37398e55130478d3d6be4857674eb3d6257e964601fa34ce37e9b943c909fd7"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.660794 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.661905 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" event={"ID":"155eb4c3-aa63-4ec7-9824-1bef2045a68b","Type":"ContainerStarted","Data":"4ccc77724e23a909585da84645728e208683f320e8de18a71cd307a11ce03b95"} Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.662830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.671222 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podStartSLOduration=3.759511455 podStartE2EDuration="39.671200815s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:44.435228026 +0000 UTC m=+1233.304556963" lastFinishedPulling="2026-03-18 15:55:20.346917386 +0000 UTC m=+1269.216246323" observedRunningTime="2026-03-18 15:55:21.666748394 +0000 UTC m=+1270.536077351" watchObservedRunningTime="2026-03-18 15:55:21.671200815 +0000 UTC m=+1270.540529752" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.773628 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" podStartSLOduration=3.901193393 podStartE2EDuration="39.773599967s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.525425381 +0000 UTC m=+1234.394754318" lastFinishedPulling="2026-03-18 15:55:21.397831955 +0000 UTC m=+1270.267160892" observedRunningTime="2026-03-18 15:55:21.712949903 +0000 UTC m=+1270.582278850" watchObservedRunningTime="2026-03-18 15:55:21.773599967 +0000 UTC m=+1270.642928904" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.783234 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podStartSLOduration=32.700290689 podStartE2EDuration="39.78321062s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:55:13.262311235 +0000 UTC m=+1262.131640172" lastFinishedPulling="2026-03-18 15:55:20.345231166 +0000 UTC m=+1269.214560103" observedRunningTime="2026-03-18 15:55:21.741551374 +0000 UTC m=+1270.610880331" watchObservedRunningTime="2026-03-18 15:55:21.78321062 +0000 UTC m=+1270.652539557" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.810511 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podStartSLOduration=32.91757827 podStartE2EDuration="39.810489512s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:55:13.452609582 +0000 UTC m=+1262.321938519" lastFinishedPulling="2026-03-18 15:55:20.345520824 +0000 UTC m=+1269.214849761" observedRunningTime="2026-03-18 15:55:21.780455469 +0000 UTC m=+1270.649784416" watchObservedRunningTime="2026-03-18 15:55:21.810489512 +0000 UTC m=+1270.679818449" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.825155 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podStartSLOduration=4.758395146 podStartE2EDuration="39.825131393s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.349131846 +0000 UTC m=+1234.218460783" lastFinishedPulling="2026-03-18 15:55:20.415868093 +0000 UTC m=+1269.285197030" observedRunningTime="2026-03-18 15:55:21.808423801 +0000 UTC m=+1270.677752758" watchObservedRunningTime="2026-03-18 15:55:21.825131393 +0000 UTC m=+1270.694460340" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.833410 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" podStartSLOduration=4.186835194 podStartE2EDuration="39.833386815s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:44.699031455 +0000 UTC m=+1233.568360392" lastFinishedPulling="2026-03-18 15:55:20.345583076 +0000 UTC m=+1269.214912013" observedRunningTime="2026-03-18 15:55:21.831562412 +0000 UTC m=+1270.700891349" watchObservedRunningTime="2026-03-18 15:55:21.833386815 +0000 UTC m=+1270.702715752" Mar 18 15:55:21 crc kubenswrapper[4792]: I0318 15:55:21.883473 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" podStartSLOduration=4.360495195 podStartE2EDuration="38.883451718s" podCreationTimestamp="2026-03-18 15:54:43 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.822556731 +0000 UTC m=+1234.691885668" lastFinishedPulling="2026-03-18 15:55:20.345513254 +0000 UTC m=+1269.214842191" observedRunningTime="2026-03-18 15:55:21.875400911 +0000 UTC m=+1270.744729858" watchObservedRunningTime="2026-03-18 15:55:21.883451718 +0000 UTC m=+1270.752780655" Mar 18 15:55:22 crc kubenswrapper[4792]: I0318 15:55:22.672115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" event={"ID":"abc215c2-57eb-4c7a-b19d-0ed3ccd67001","Type":"ContainerStarted","Data":"d5c5b03fca2d9c4a7c560929e8085b40ef22494b9e266c2fe7264821d8329fd4"} Mar 18 15:55:22 crc kubenswrapper[4792]: I0318 15:55:22.673641 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:55:22 crc kubenswrapper[4792]: I0318 15:55:22.695375 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" podStartSLOduration=4.453758776 podStartE2EDuration="40.695355737s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.486739384 +0000 UTC m=+1234.356068321" lastFinishedPulling="2026-03-18 15:55:21.728336345 +0000 UTC m=+1270.597665282" observedRunningTime="2026-03-18 15:55:22.692706389 +0000 UTC m=+1271.562035336" watchObservedRunningTime="2026-03-18 15:55:22.695355737 +0000 UTC m=+1271.564684674" Mar 18 15:55:22 crc kubenswrapper[4792]: I0318 15:55:22.696478 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podStartSLOduration=6.097108589 
podStartE2EDuration="40.696471781s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.818216102 +0000 UTC m=+1234.687545039" lastFinishedPulling="2026-03-18 15:55:20.417579294 +0000 UTC m=+1269.286908231" observedRunningTime="2026-03-18 15:55:21.902206409 +0000 UTC m=+1270.771535346" watchObservedRunningTime="2026-03-18 15:55:22.696471781 +0000 UTC m=+1271.565800718" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.015034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.084581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.207704 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.244695 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.260416 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" Mar 18 15:55:23 crc kubenswrapper[4792]: I0318 15:55:23.804016 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.025248 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-s4j4w" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.692514 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" event={"ID":"96809e41-8656-4095-a2f9-9d69c31efe61","Type":"ContainerStarted","Data":"08aab4b4971debad47d4ac267808042f15a6fe5fd19f2604aac29f911b2e9a19"} Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.693009 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.694431 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" event={"ID":"65722e7d-1557-437c-ae5c-383082933c8c","Type":"ContainerStarted","Data":"e3f6d56457408af3e04ec885200551e1224cd01dcd9182a66e3e052970d01293"} Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.694603 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.695386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" event={"ID":"5e4dd350-9a5b-4626-8b3d-6b9c097b4be1","Type":"ContainerStarted","Data":"0919e987d0fb01899b28e3c93de9fd62a1efdc75d37b210f5c7162119150963c"} Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.695593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.714118 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" podStartSLOduration=3.327138131 podStartE2EDuration="41.714100642s" podCreationTimestamp="2026-03-18 15:54:43 +0000 UTC" firstStartedPulling="2026-03-18 15:54:46.035076511 
+0000 UTC m=+1234.904405448" lastFinishedPulling="2026-03-18 15:55:24.422039022 +0000 UTC m=+1273.291367959" observedRunningTime="2026-03-18 15:55:24.709312052 +0000 UTC m=+1273.578640989" watchObservedRunningTime="2026-03-18 15:55:24.714100642 +0000 UTC m=+1273.583429579" Mar 18 15:55:24 crc kubenswrapper[4792]: I0318 15:55:24.729648 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podStartSLOduration=5.081588031 podStartE2EDuration="42.729598598s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.850589975 +0000 UTC m=+1234.719918912" lastFinishedPulling="2026-03-18 15:55:23.498600542 +0000 UTC m=+1272.367929479" observedRunningTime="2026-03-18 15:55:24.727218458 +0000 UTC m=+1273.596547395" watchObservedRunningTime="2026-03-18 15:55:24.729598598 +0000 UTC m=+1273.598927535" Mar 18 15:55:25 crc kubenswrapper[4792]: I0318 15:55:25.881192 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" podStartSLOduration=5.822173362 podStartE2EDuration="43.881173118s" podCreationTimestamp="2026-03-18 15:54:42 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.44108039 +0000 UTC m=+1234.310409327" lastFinishedPulling="2026-03-18 15:55:23.500080146 +0000 UTC m=+1272.369409083" observedRunningTime="2026-03-18 15:55:24.754578922 +0000 UTC m=+1273.623907869" watchObservedRunningTime="2026-03-18 15:55:25.881173118 +0000 UTC m=+1274.750502045" Mar 18 15:55:26 crc kubenswrapper[4792]: I0318 15:55:26.715871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" event={"ID":"eb5bab1d-63b4-4ae0-8dfe-734700253a4f","Type":"ContainerStarted","Data":"b3a8def23f652ee81922c7ae92286c5207cf5d80a669a1cc6a5ee6b3b40a841e"} Mar 18 15:55:26 crc kubenswrapper[4792]: I0318 
15:55:26.716676 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:55:26 crc kubenswrapper[4792]: I0318 15:55:26.733676 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" podStartSLOduration=3.618904554 podStartE2EDuration="43.733658162s" podCreationTimestamp="2026-03-18 15:54:43 +0000 UTC" firstStartedPulling="2026-03-18 15:54:45.814627638 +0000 UTC m=+1234.683956575" lastFinishedPulling="2026-03-18 15:55:25.929381246 +0000 UTC m=+1274.798710183" observedRunningTime="2026-03-18 15:55:26.732396744 +0000 UTC m=+1275.601725681" watchObservedRunningTime="2026-03-18 15:55:26.733658162 +0000 UTC m=+1275.602987099" Mar 18 15:55:29 crc kubenswrapper[4792]: I0318 15:55:29.107218 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 15:55:29 crc kubenswrapper[4792]: I0318 15:55:29.756619 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 15:55:30 crc kubenswrapper[4792]: I0318 15:55:30.322158 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:55:30 crc kubenswrapper[4792]: I0318 15:55:30.322219 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.171756 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.272018 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.503640 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.516127 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.586092 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.611205 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" Mar 18 15:55:33 crc kubenswrapper[4792]: I0318 15:55:33.996297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" Mar 18 15:55:34 crc kubenswrapper[4792]: I0318 15:55:34.016155 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 15:55:34 crc kubenswrapper[4792]: I0318 15:55:34.068493 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" Mar 18 15:55:34 crc kubenswrapper[4792]: 
I0318 15:55:34.114558 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 15:55:34 crc kubenswrapper[4792]: I0318 15:55:34.153227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.889774 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.902749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.908845 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.909080 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.909189 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.909327 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-f2jjw" Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.912548 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:55:51 crc kubenswrapper[4792]: I0318 15:55:51.980954 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhqpz\" (UniqueName: \"kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:51 crc 
kubenswrapper[4792]: I0318 15:55:51.981382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.003383 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.005513 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.012740 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.038586 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.083359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.083431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.083483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhqpz\" (UniqueName: 
\"kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.083523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.083597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zscxm\" (UniqueName: \"kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.084629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.128237 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhqpz\" (UniqueName: \"kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz\") pod \"dnsmasq-dns-675f4bcbfc-q4c77\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.184969 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.185085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.185149 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zscxm\" (UniqueName: \"kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.185990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.186176 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config\") pod \"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.207825 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscxm\" (UniqueName: \"kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm\") pod 
\"dnsmasq-dns-78dd6ddcc-6jdn7\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.228018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.335690 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.727235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:55:52 crc kubenswrapper[4792]: W0318 15:55:52.888942 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84026ec8_3948_49fc_ba35_fccab5d78b37.slice/crio-8064d681b03f1fe72fd640241bffdb2ce838dc01e08ba5ce74242a0f75f26615 WatchSource:0}: Error finding container 8064d681b03f1fe72fd640241bffdb2ce838dc01e08ba5ce74242a0f75f26615: Status 404 returned error can't find the container with id 8064d681b03f1fe72fd640241bffdb2ce838dc01e08ba5ce74242a0f75f26615 Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.890877 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.929581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" event={"ID":"84026ec8-3948-49fc-ba35-fccab5d78b37","Type":"ContainerStarted","Data":"8064d681b03f1fe72fd640241bffdb2ce838dc01e08ba5ce74242a0f75f26615"} Mar 18 15:55:52 crc kubenswrapper[4792]: I0318 15:55:52.930374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" event={"ID":"08489187-6273-4bfb-b71e-4a0f4ce64103","Type":"ContainerStarted","Data":"b02e26824a0858363ee8d6b2590cf9aa7c1830478fc3e641b8889ce7a9d27228"} Mar 18 
15:55:54 crc kubenswrapper[4792]: I0318 15:55:54.925151 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:55:54 crc kubenswrapper[4792]: I0318 15:55:54.952613 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"] Mar 18 15:55:54 crc kubenswrapper[4792]: I0318 15:55:54.956326 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:54 crc kubenswrapper[4792]: I0318 15:55:54.963803 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"] Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.056531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4bll\" (UniqueName: \"kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.056831 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.056875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.161555 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.161631 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4bll\" (UniqueName: \"kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.162622 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.163406 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.164246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.200189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4bll\" (UniqueName: 
\"kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll\") pod \"dnsmasq-dns-5ccc8479f9-2qggt\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") " pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.242909 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.279082 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"] Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.280487 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.300998 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"] Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.313202 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.372516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.372627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.372672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njf7l\" (UniqueName: \"kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.476610 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.477486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.477588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njf7l\" (UniqueName: \"kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.477870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.479364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.502440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njf7l\" (UniqueName: \"kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l\") pod \"dnsmasq-dns-57d769cc4f-fx8zk\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") " pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.614320 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:55:55 crc kubenswrapper[4792]: I0318 15:55:55.900453 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"] Mar 18 15:55:55 crc kubenswrapper[4792]: W0318 15:55:55.905584 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d818297_1bcf_4ad2_805e_b30d10670888.slice/crio-9c22b680f7bad409a4261bdf489c7f20310ff0658890d693a166ef7b6a5c7502 WatchSource:0}: Error finding container 9c22b680f7bad409a4261bdf489c7f20310ff0658890d693a166ef7b6a5c7502: Status 404 returned error can't find the container with id 9c22b680f7bad409a4261bdf489c7f20310ff0658890d693a166ef7b6a5c7502 Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.007533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" event={"ID":"9d818297-1bcf-4ad2-805e-b30d10670888","Type":"ContainerStarted","Data":"9c22b680f7bad409a4261bdf489c7f20310ff0658890d693a166ef7b6a5c7502"} Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.069167 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.071033 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078199 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078497 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078579 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078707 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.078938 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-brgwk" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.111036 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.173256 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"] Mar 18 15:55:56 crc kubenswrapper[4792]: W0318 15:55:56.173999 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4665db60_5e6b_4a39_8750_bfabb9ea2631.slice/crio-32f1d902126d995012fe1884a137cd1ddd01dcf965a2709a1ab2b0ad4d7d1ba8 WatchSource:0}: Error finding container 
32f1d902126d995012fe1884a137cd1ddd01dcf965a2709a1ab2b0ad4d7d1ba8: Status 404 returned error can't find the container with id 32f1d902126d995012fe1884a137cd1ddd01dcf965a2709a1ab2b0ad4d7d1ba8 Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199264 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8lkm\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199352 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.199728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.308795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8lkm\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.311278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.311605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.311690 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.312431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.312463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.315478 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.315521 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6aa12682446948c208a7df2de2f3e0d6fe0df3f4db75202487c5c3b9a696ecf4/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.316898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.330449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.335588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8lkm\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.340742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.342399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.391228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.412366 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.414413 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.426754 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.427218 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.427389 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zv645" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.428097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.428314 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.428466 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.428624 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.430066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.459901 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.461857 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.481205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.499417 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.507319 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.509081 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519247 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519355 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519512 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wltw\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519581 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.519638 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.539089 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621399 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621648 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621713 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzq9b\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621800 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.621823 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.622453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.622929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.623339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.623433 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.623482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.623527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.623584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.624364 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.625942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: 
\"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626680 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " 
pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626835 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626923 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.626997 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.627022 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.627187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.627238 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wltw\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.627256 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5l95\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.627803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.628328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.631015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.631416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.631838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.632788 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.632822 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1628534424672959792f013c755b2348fd03dde948952a742875a82406539b79/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.646098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wltw\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.693672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.731831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.731934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data\") pod \"rabbitmq-server-1\" 
(UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.731955 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732030 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 
15:55:56.732094 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732117 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732149 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5l95\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732295 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732343 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzq9b\" (UniqueName: 
\"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732464 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732500 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732597 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.732669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.733563 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " 
pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.734559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.735826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.735848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.736142 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.736381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.736415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.746623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.747688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.750443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.757173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.757352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc 
kubenswrapper[4792]: I0318 15:55:56.757844 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.757904 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/87ee01b6bedb30e6fd03d8511a9d9616a8c5f390321a6aaea3aadb41dde33bb0/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.757965 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.758562 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.758847 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.759193 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.759241 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e806a32f563ffb605d360d449207970d828596911ba7075052f9c981032e8d8/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.759302 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.759912 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.765508 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.768734 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5l95\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.768895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.770202 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzq9b\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.828297 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " pod="openstack/rabbitmq-server-2" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.849720 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " pod="openstack/rabbitmq-server-1" Mar 18 15:55:56 crc kubenswrapper[4792]: I0318 15:55:56.916468 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.059121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" event={"ID":"4665db60-5e6b-4a39-8750-bfabb9ea2631","Type":"ContainerStarted","Data":"32f1d902126d995012fe1884a137cd1ddd01dcf965a2709a1ab2b0ad4d7d1ba8"} Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.090407 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.155992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 15:55:57 crc kubenswrapper[4792]: W0318 15:55:57.242947 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0987841_aa1a_4130_a8e9_aeab1ba7aade.slice/crio-68a73a64d4dffe2611d6dc7d272e9515fc3d79e622f71e037de3b3d13ea8ffeb WatchSource:0}: Error finding container 68a73a64d4dffe2611d6dc7d272e9515fc3d79e622f71e037de3b3d13ea8ffeb: Status 404 returned error can't find the container with id 68a73a64d4dffe2611d6dc7d272e9515fc3d79e622f71e037de3b3d13ea8ffeb Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.443764 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.479311 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.511936 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.512520 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.514634 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wdq2j" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.518415 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.518651 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.520891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.538921 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " 
pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571584 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkwq\" (UniqueName: \"kubernetes.io/projected/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kube-api-access-drkwq\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.571680 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.572450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " 
pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.572550 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.674569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.674960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkwq\" (UniqueName: \"kubernetes.io/projected/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kube-api-access-drkwq\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.675095 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.675178 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc 
kubenswrapper[4792]: I0318 15:55:57.675253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.675288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.675362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.675383 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.676329 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kolla-config\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.677450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.679292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.679518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.684322 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.709948 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.711846 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.711894 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf05bf56f4c648d7028858b3b21a64ab1d2884f5551c1d85985be872e4491b8f/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.719435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkwq\" (UniqueName: \"kubernetes.io/projected/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-kube-api-access-drkwq\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.720574 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38dfbae-0508-4b57-b5d8-d47fcdd35fd6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: W0318 15:55:57.763603 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe818cb3_6cf1_4945_a96e_25c124ed1098.slice/crio-64bf54b85b9e808b291fec80bc6b702efa5b395b9ee767c9f9bc8ee86a106adf WatchSource:0}: Error finding container 64bf54b85b9e808b291fec80bc6b702efa5b395b9ee767c9f9bc8ee86a106adf: Status 404 returned error can't find the container with id 64bf54b85b9e808b291fec80bc6b702efa5b395b9ee767c9f9bc8ee86a106adf Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.883789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfdb4320-4d22-437d-ac37-8f79c98010e3\") pod \"openstack-galera-0\" (UID: \"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6\") " pod="openstack/openstack-galera-0" Mar 18 15:55:57 crc kubenswrapper[4792]: I0318 15:55:57.938203 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 15:55:57 crc kubenswrapper[4792]: W0318 15:55:57.979141 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753d5ec4_134d_48f9_ad6c_aa17f8856b5a.slice/crio-3c6dea45e282aa58d2079f7b2a2e49d09bf9a589d398a47886cdab30f0b4d4ba WatchSource:0}: Error finding container 3c6dea45e282aa58d2079f7b2a2e49d09bf9a589d398a47886cdab30f0b4d4ba: Status 404 returned error can't find the container with id 3c6dea45e282aa58d2079f7b2a2e49d09bf9a589d398a47886cdab30f0b4d4ba Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.078444 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerStarted","Data":"3c6dea45e282aa58d2079f7b2a2e49d09bf9a589d398a47886cdab30f0b4d4ba"} Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.080856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerStarted","Data":"64bf54b85b9e808b291fec80bc6b702efa5b395b9ee767c9f9bc8ee86a106adf"} Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.082883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerStarted","Data":"3c6df70d6a00540530ae4d5ae33e5ae998e9eaa9e64a681b6bf004040ca3e165"} Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.091980 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerStarted","Data":"68a73a64d4dffe2611d6dc7d272e9515fc3d79e622f71e037de3b3d13ea8ffeb"} Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.119288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.852235 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.854890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.857474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.858064 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.858227 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.858494 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9z7xx" Mar 18 15:55:58 crc kubenswrapper[4792]: W0318 15:55:58.891314 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb38dfbae_0508_4b57_b5d8_d47fcdd35fd6.slice/crio-b2b989383ed3cc6a8864d2a0e450e00555258142f3dcb5576759f1f63ea3ce0a WatchSource:0}: Error finding container b2b989383ed3cc6a8864d2a0e450e00555258142f3dcb5576759f1f63ea3ce0a: Status 404 returned error can't find the container with id b2b989383ed3cc6a8864d2a0e450e00555258142f3dcb5576759f1f63ea3ce0a Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 
15:55:58.920944 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:55:58 crc kubenswrapper[4792]: I0318 15:55:58.938169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.008723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009439 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846n9\" (UniqueName: \"kubernetes.io/projected/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kube-api-access-846n9\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009645 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009696 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.009959 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.010129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.082093 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.083446 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.085990 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.086131 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.086418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8ntzx" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.103064 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116357 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116425 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116550 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846n9\" (UniqueName: \"kubernetes.io/projected/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kube-api-access-846n9\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.116768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.117262 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.118271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.122672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.123631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.130042 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.150575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.152426 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.152468 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a2a5207252a54982068cef0b7721ba283367984d9c3120cc231df6b0d665194/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.160112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846n9\" (UniqueName: \"kubernetes.io/projected/a64098b6-eb41-40ef-8d9b-6dd69c107ee2-kube-api-access-846n9\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.219770 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.220120 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-config-data\") pod \"memcached-0\" (UID: 
\"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.220233 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6smfw\" (UniqueName: \"kubernetes.io/projected/4cc19e41-291b-4aa8-b862-2efc890cea99-kube-api-access-6smfw\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.220266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-kolla-config\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.220526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.252385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba22be8-2fa7-403d-81b9-77d4a6f13d94\") pod \"openstack-cell1-galera-0\" (UID: \"a64098b6-eb41-40ef-8d9b-6dd69c107ee2\") " pod="openstack/openstack-cell1-galera-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.274013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerStarted","Data":"b2b989383ed3cc6a8864d2a0e450e00555258142f3dcb5576759f1f63ea3ce0a"} Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.323910 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.324018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-config-data\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.324143 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6smfw\" (UniqueName: \"kubernetes.io/projected/4cc19e41-291b-4aa8-b862-2efc890cea99-kube-api-access-6smfw\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.324191 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-kolla-config\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.324289 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.326640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-kolla-config\") pod \"memcached-0\" (UID: 
\"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.327037 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cc19e41-291b-4aa8-b862-2efc890cea99-config-data\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.345246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.345689 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc19e41-291b-4aa8-b862-2efc890cea99-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.345711 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6smfw\" (UniqueName: \"kubernetes.io/projected/4cc19e41-291b-4aa8-b862-2efc890cea99-kube-api-access-6smfw\") pod \"memcached-0\" (UID: \"4cc19e41-291b-4aa8-b862-2efc890cea99\") " pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.415300 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 15:55:59 crc kubenswrapper[4792]: I0318 15:55:59.551305 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.143814 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564156-j4qrz"] Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.150471 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.158732 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.161243 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.175275 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.186083 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-j4qrz"] Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.265981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tth8d\" (UniqueName: \"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d\") pod \"auto-csr-approver-29564156-j4qrz\" (UID: \"7a319b20-00fe-4182-8fdb-1f71c5f4f655\") " pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.321791 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.321854 4792 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.369630 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.369930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tth8d\" (UniqueName: \"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d\") pod \"auto-csr-approver-29564156-j4qrz\" (UID: \"7a319b20-00fe-4182-8fdb-1f71c5f4f655\") " pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.410883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tth8d\" (UniqueName: \"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d\") pod \"auto-csr-approver-29564156-j4qrz\" (UID: \"7a319b20-00fe-4182-8fdb-1f71c5f4f655\") " pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:00 crc kubenswrapper[4792]: W0318 15:56:00.411733 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cc19e41_291b_4aa8_b862_2efc890cea99.slice/crio-5cb19f412ca37dec18c2007d069d46c7bca8f36434d38918660a9cf2f137e62e WatchSource:0}: Error finding container 5cb19f412ca37dec18c2007d069d46c7bca8f36434d38918660a9cf2f137e62e: Status 404 returned error can't find the container with id 5cb19f412ca37dec18c2007d069d46c7bca8f36434d38918660a9cf2f137e62e Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.509235 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:00 crc kubenswrapper[4792]: I0318 15:56:00.659281 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.354244 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4cc19e41-291b-4aa8-b862-2efc890cea99","Type":"ContainerStarted","Data":"5cb19f412ca37dec18c2007d069d46c7bca8f36434d38918660a9cf2f137e62e"} Mar 18 15:56:01 crc kubenswrapper[4792]: W0318 15:56:01.624394 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64098b6_eb41_40ef_8d9b_6dd69c107ee2.slice/crio-4c8e0287540959c03235f3607360f73df7d2e6e7e2a61565ae35c75d1ec24b0a WatchSource:0}: Error finding container 4c8e0287540959c03235f3607360f73df7d2e6e7e2a61565ae35c75d1ec24b0a: Status 404 returned error can't find the container with id 4c8e0287540959c03235f3607360f73df7d2e6e7e2a61565ae35c75d1ec24b0a Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.677141 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.690125 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.702526 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-k4mpq" Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.720810 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.831352 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkc2\" (UniqueName: \"kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2\") pod \"kube-state-metrics-0\" (UID: \"018a5f60-5274-4779-912f-4d7c32b6bfe5\") " pod="openstack/kube-state-metrics-0" Mar 18 15:56:01 crc kubenswrapper[4792]: I0318 15:56:01.932994 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkc2\" (UniqueName: \"kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2\") pod \"kube-state-metrics-0\" (UID: \"018a5f60-5274-4779-912f-4d7c32b6bfe5\") " pod="openstack/kube-state-metrics-0" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.016476 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkc2\" (UniqueName: \"kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2\") pod \"kube-state-metrics-0\" (UID: \"018a5f60-5274-4779-912f-4d7c32b6bfe5\") " pod="openstack/kube-state-metrics-0" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.032351 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.386119 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerStarted","Data":"4c8e0287540959c03235f3607360f73df7d2e6e7e2a61565ae35c75d1ec24b0a"} Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.605261 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5"] Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.620838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.626605 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.628604 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-bdbdc" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.689952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7nh\" (UniqueName: \"kubernetes.io/projected/675f6ffb-b144-4efc-b47a-81c748cb4765-kube-api-access-gd7nh\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.693983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " 
pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.694506 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5"] Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.774076 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-j4qrz"] Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.800386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7nh\" (UniqueName: \"kubernetes.io/projected/675f6ffb-b144-4efc-b47a-81c748cb4765-kube-api-access-gd7nh\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.800481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:02 crc kubenswrapper[4792]: E0318 15:56:02.800820 4792 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 18 15:56:02 crc kubenswrapper[4792]: E0318 15:56:02.800909 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert podName:675f6ffb-b144-4efc-b47a-81c748cb4765 nodeName:}" failed. No retries permitted until 2026-03-18 15:56:03.300884604 +0000 UTC m=+1312.170213551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert") pod "observability-ui-dashboards-7f87b9b85b-ct7p5" (UID: "675f6ffb-b144-4efc-b47a-81c748cb4765") : secret "observability-ui-dashboards" not found Mar 18 15:56:02 crc kubenswrapper[4792]: W0318 15:56:02.809436 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a319b20_00fe_4182_8fdb_1f71c5f4f655.slice/crio-f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb WatchSource:0}: Error finding container f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb: Status 404 returned error can't find the container with id f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb Mar 18 15:56:02 crc kubenswrapper[4792]: I0318 15:56:02.846038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7nh\" (UniqueName: \"kubernetes.io/projected/675f6ffb-b144-4efc-b47a-81c748cb4765-kube-api-access-gd7nh\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.007037 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-855f5dc7f-qnkcz"] Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.009003 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.036000 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-855f5dc7f-qnkcz"] Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.074778 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.088136 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.098745 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.099047 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.099175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.101434 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.101911 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-srpm2" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.102067 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.102170 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.106129 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.126923 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.147444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-oauth-config\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.147694 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-service-ca\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.147824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-oauth-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.147908 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.148000 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-console-config\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.148060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfk9h\" (UniqueName: \"kubernetes.io/projected/0e8a660f-2f46-41b3-badb-2d1164cea860-kube-api-access-qfk9h\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.149828 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-trusted-ca-bundle\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.234288 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267372 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-console-config\") pod \"console-855f5dc7f-qnkcz\" (UID: 
\"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267503 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfk9h\" (UniqueName: \"kubernetes.io/projected/0e8a660f-2f46-41b3-badb-2d1164cea860-kube-api-access-qfk9h\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-trusted-ca-bundle\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267671 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.267691 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-oauth-config\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxlr\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269648 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-service-ca\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269771 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-oauth-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.269883 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.270437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-trusted-ca-bundle\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.270714 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-service-ca\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.271996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-oauth-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.272621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e8a660f-2f46-41b3-badb-2d1164cea860-console-config\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.277418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-oauth-config\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.287252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8a660f-2f46-41b3-badb-2d1164cea860-console-serving-cert\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.290342 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfk9h\" (UniqueName: \"kubernetes.io/projected/0e8a660f-2f46-41b3-badb-2d1164cea860-kube-api-access-qfk9h\") pod \"console-855f5dc7f-qnkcz\" (UID: \"0e8a660f-2f46-41b3-badb-2d1164cea860\") " pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.377492 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.378063 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.378189 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.378287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.378754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.379821 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.379112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.381837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.382046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxlr\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.382359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.382807 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.382816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.383015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.383074 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.390909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.391183 4792 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.393486 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/45fd3dc2e95d3e59015b40d9f64664aa445807a0df4a2d1f8578969394abae77/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.439245 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.441505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675f6ffb-b144-4efc-b47a-81c748cb4765-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-ct7p5\" (UID: \"675f6ffb-b144-4efc-b47a-81c748cb4765\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.442687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.442839 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.446084 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.451246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.451859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxlr\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.464912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" event={"ID":"7a319b20-00fe-4182-8fdb-1f71c5f4f655","Type":"ContainerStarted","Data":"f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb"} Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.519163 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.710922 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" Mar 18 15:56:03 crc kubenswrapper[4792]: I0318 15:56:03.725733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.311480 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m977k"] Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.317959 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.321715 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pl42b" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.321791 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.324338 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.353138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k"] Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423522 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " 
pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-ovn-controller-tls-certs\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423683 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-combined-ca-bundle\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run-ovn\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t988t\" (UniqueName: \"kubernetes.io/projected/b90ccac6-a973-4572-834a-f7215cfc72a7-kube-api-access-t988t\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423868 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-log-ovn\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " 
pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.423915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90ccac6-a973-4572-834a-f7215cfc72a7-scripts\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.488145 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6xllm"] Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.490768 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.503693 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6xllm"] Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526152 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-ovn-controller-tls-certs\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-combined-ca-bundle\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526286 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run-ovn\") pod \"ovn-controller-m977k\" (UID: 
\"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t988t\" (UniqueName: \"kubernetes.io/projected/b90ccac6-a973-4572-834a-f7215cfc72a7-kube-api-access-t988t\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526422 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-log-ovn\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90ccac6-a973-4572-834a-f7215cfc72a7-scripts\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.526539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.528944 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.529473 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-log-ovn\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.530064 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b90ccac6-a973-4572-834a-f7215cfc72a7-var-run-ovn\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.531936 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90ccac6-a973-4572-834a-f7215cfc72a7-scripts\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.548778 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-ovn-controller-tls-certs\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.572384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90ccac6-a973-4572-834a-f7215cfc72a7-combined-ca-bundle\") pod \"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.577096 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t988t\" (UniqueName: \"kubernetes.io/projected/b90ccac6-a973-4572-834a-f7215cfc72a7-kube-api-access-t988t\") pod 
\"ovn-controller-m977k\" (UID: \"b90ccac6-a973-4572-834a-f7215cfc72a7\") " pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-log\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-run\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630715 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-lib\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630801 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-etc-ovs\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630832 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-scripts\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") 
" pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.630901 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8wl\" (UniqueName: \"kubernetes.io/projected/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-kube-api-access-vx8wl\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.653131 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m977k" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.733742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-scripts\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.733829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8wl\" (UniqueName: \"kubernetes.io/projected/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-kube-api-access-vx8wl\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-log\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734086 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-run\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734164 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-lib\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-etc-ovs\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-run\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-lib\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-var-log\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " 
pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.734899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-etc-ovs\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.736486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-scripts\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.762859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8wl\" (UniqueName: \"kubernetes.io/projected/e0f3bd33-05e2-4174-a371-af75ef9fdb7d-kube-api-access-vx8wl\") pod \"ovn-controller-ovs-6xllm\" (UID: \"e0f3bd33-05e2-4174-a371-af75ef9fdb7d\") " pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:04 crc kubenswrapper[4792]: I0318 15:56:04.834160 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.294152 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.298806 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.302073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t9stt" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.305747 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.306854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.308122 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.308262 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.308349 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388509 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388714 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-config\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388818 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.388887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.389095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kjpx2\" (UniqueName: \"kubernetes.io/projected/b19f3e39-4198-4eef-bbe8-67e28fcef034-kube-api-access-kjpx2\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.491663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjpx2\" (UniqueName: \"kubernetes.io/projected/b19f3e39-4198-4eef-bbe8-67e28fcef034-kube-api-access-kjpx2\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.491799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.491869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.491899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.491999 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.492588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.493657 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.493822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-config\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.493871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.493919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.495893 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19f3e39-4198-4eef-bbe8-67e28fcef034-config\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.500827 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.500894 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d580a88c4df59d7a1f14c0c76a6add5db0513a5299c09888b6188e82b4f981dd/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.519343 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.519457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.519578 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b19f3e39-4198-4eef-bbe8-67e28fcef034-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.540442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjpx2\" (UniqueName: \"kubernetes.io/projected/b19f3e39-4198-4eef-bbe8-67e28fcef034-kube-api-access-kjpx2\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.602862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcfa83f-cb21-4c47-8396-af52b960cea6\") pod \"ovsdbserver-nb-0\" (UID: \"b19f3e39-4198-4eef-bbe8-67e28fcef034\") " pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:06 crc kubenswrapper[4792]: I0318 15:56:06.656574 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.186149 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.188834 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.190600 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.190812 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ffgjv" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.191387 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.191636 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.194834 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.230787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.230838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.230907 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.230996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.231034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxmdh\" (UniqueName: \"kubernetes.io/projected/5f119c5e-bb19-41d0-b87c-4962192e94e5-kube-api-access-wxmdh\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.231061 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.231148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-923f669e-b208-47df-ae56-657e53e71cab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-923f669e-b208-47df-ae56-657e53e71cab\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.231207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.332694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.332759 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.332864 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.332914 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.332934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxmdh\" (UniqueName: \"kubernetes.io/projected/5f119c5e-bb19-41d0-b87c-4962192e94e5-kube-api-access-wxmdh\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 
15:56:08.332956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.333013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-923f669e-b208-47df-ae56-657e53e71cab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-923f669e-b208-47df-ae56-657e53e71cab\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.333044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.333386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.334130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.334612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f119c5e-bb19-41d0-b87c-4962192e94e5-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.335743 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.335791 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-923f669e-b208-47df-ae56-657e53e71cab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-923f669e-b208-47df-ae56-657e53e71cab\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6ed1af4827678659f79c52b6417581c26e5acfa5a2e0932d5c9b573a3b7f4ba6/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.340244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.340463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.347724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f119c5e-bb19-41d0-b87c-4962192e94e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 
15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.350902 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxmdh\" (UniqueName: \"kubernetes.io/projected/5f119c5e-bb19-41d0-b87c-4962192e94e5-kube-api-access-wxmdh\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.371081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-923f669e-b208-47df-ae56-657e53e71cab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-923f669e-b208-47df-ae56-657e53e71cab\") pod \"ovsdbserver-sb-0\" (UID: \"5f119c5e-bb19-41d0-b87c-4962192e94e5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:08 crc kubenswrapper[4792]: I0318 15:56:08.515258 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 15:56:11 crc kubenswrapper[4792]: I0318 15:56:11.641574 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"018a5f60-5274-4779-912f-4d7c32b6bfe5","Type":"ContainerStarted","Data":"c06a89c1ad7d6139f4bd82b52d85f44eef2a1c0fce1496adc31fd75fb82d6a0a"} Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.445661 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.446263 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wltw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil
,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(a3217a72-3ad4-4bb5-bf86-c1daa2e409c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.448146 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.465801 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.466029 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8lkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(e0987841-aa1a-4130-a8e9-aeab1ba7aade): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.467251 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.512724 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.512964 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzq9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-2_openstack(be818cb3-6cf1-4945-a96e-25c124ed1098): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:17 crc 
kubenswrapper[4792]: E0318 15:56:17.518391 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-2" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.697173 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-2" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.697233 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" Mar 18 15:56:17 crc kubenswrapper[4792]: E0318 15:56:17.697494 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" Mar 18 15:56:17 crc kubenswrapper[4792]: I0318 15:56:17.878850 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:56:24 crc kubenswrapper[4792]: E0318 15:56:24.333397 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Mar 18 15:56:24 crc kubenswrapper[4792]: E0318 15:56:24.334095 4792 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n647h7h677h68fh5cbh5bdh688h5chb5h655h5c4h68bh5fdh5b7h5c6h5c4h58bh647h556h546h555h5c7hcbh5d5h9h555h9ch59fh5cbh5d9hf6h5d5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6smfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,Su
bPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(4cc19e41-291b-4aa8-b862-2efc890cea99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:24 crc kubenswrapper[4792]: E0318 15:56:24.335834 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="4cc19e41-291b-4aa8-b862-2efc890cea99" Mar 18 15:56:24 crc kubenswrapper[4792]: W0318 15:56:24.366450 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8f9d61_a4a8_4579_9d91_722dfe7aa68a.slice/crio-7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b WatchSource:0}: Error 
finding container 7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b: Status 404 returned error can't find the container with id 7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b Mar 18 15:56:24 crc kubenswrapper[4792]: I0318 15:56:24.769377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerStarted","Data":"7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b"} Mar 18 15:56:24 crc kubenswrapper[4792]: E0318 15:56:24.771751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="4cc19e41-291b-4aa8-b862-2efc890cea99" Mar 18 15:56:24 crc kubenswrapper[4792]: I0318 15:56:24.786515 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5"] Mar 18 15:56:25 crc kubenswrapper[4792]: I0318 15:56:25.334919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.639057 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.639232 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed 
--no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhqpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-q4c77_openstack(08489187-6273-4bfb-b71e-4a0f4ce64103): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.640421 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc 
= copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" podUID="08489187-6273-4bfb-b71e-4a0f4ce64103" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.679014 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.679499 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zscxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMes
sagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6jdn7_openstack(84026ec8-3948-49fc-ba35-fccab5d78b37): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.680907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" podUID="84026ec8-3948-49fc-ba35-fccab5d78b37" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.696719 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.696901 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4bll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-2qggt_openstack(9d818297-1bcf-4ad2-805e-b30d10670888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.698452 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" podUID="9d818297-1bcf-4ad2-805e-b30d10670888" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.704692 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.704870 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njf7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-fx8zk_openstack(4665db60-5e6b-4a39-8750-bfabb9ea2631): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.706219 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" podUID="4665db60-5e6b-4a39-8750-bfabb9ea2631" Mar 18 15:56:25 crc kubenswrapper[4792]: I0318 15:56:25.789178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f119c5e-bb19-41d0-b87c-4962192e94e5","Type":"ContainerStarted","Data":"5b22fae3c7bb2bb275835e28194d7bc06fc4f5525046ab6cc82bb7d8e4fd5efb"} Mar 18 15:56:25 crc kubenswrapper[4792]: I0318 15:56:25.791470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" event={"ID":"675f6ffb-b144-4efc-b47a-81c748cb4765","Type":"ContainerStarted","Data":"9aee4eb241cbc2b0ee24c2d5c7985fba203c0639432bc4489475955d969708f4"} Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.793408 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" podUID="9d818297-1bcf-4ad2-805e-b30d10670888" Mar 18 15:56:25 crc kubenswrapper[4792]: E0318 15:56:25.793763 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" podUID="4665db60-5e6b-4a39-8750-bfabb9ea2631" Mar 18 15:56:26 crc kubenswrapper[4792]: I0318 15:56:26.416645 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-855f5dc7f-qnkcz"] Mar 18 15:56:26 crc kubenswrapper[4792]: I0318 15:56:26.626420 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k"] Mar 
18 15:56:26 crc kubenswrapper[4792]: I0318 15:56:26.713738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 15:56:26 crc kubenswrapper[4792]: I0318 15:56:26.806017 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-855f5dc7f-qnkcz" event={"ID":"0e8a660f-2f46-41b3-badb-2d1164cea860","Type":"ContainerStarted","Data":"9c841900de2cd27671cf36553c1bdb28defc3267478b663a98e7affdd12a2a6d"} Mar 18 15:56:26 crc kubenswrapper[4792]: I0318 15:56:26.845369 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6xllm"] Mar 18 15:56:27 crc kubenswrapper[4792]: W0318 15:56:27.111416 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb90ccac6_a973_4572_834a_f7215cfc72a7.slice/crio-62d949a38fede235d36424ee32be0c12d4fd89d246d0092a248e190a566c0074 WatchSource:0}: Error finding container 62d949a38fede235d36424ee32be0c12d4fd89d246d0092a248e190a566c0074: Status 404 returned error can't find the container with id 62d949a38fede235d36424ee32be0c12d4fd89d246d0092a248e190a566c0074 Mar 18 15:56:27 crc kubenswrapper[4792]: W0318 15:56:27.119461 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0f3bd33_05e2_4174_a371_af75ef9fdb7d.slice/crio-b99681985d9d945a1d35663e3c847b49ead58720d5c5e4490d055cde8d2ec42e WatchSource:0}: Error finding container b99681985d9d945a1d35663e3c847b49ead58720d5c5e4490d055cde8d2ec42e: Status 404 returned error can't find the container with id b99681985d9d945a1d35663e3c847b49ead58720d5c5e4490d055cde8d2ec42e Mar 18 15:56:27 crc kubenswrapper[4792]: W0318 15:56:27.130163 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19f3e39_4198_4eef_bbe8_67e28fcef034.slice/crio-fc57a9bbdedde4843b658be62d45c97e98a25200945f7ae1b36c43e724c8b8f4 WatchSource:0}: Error finding container fc57a9bbdedde4843b658be62d45c97e98a25200945f7ae1b36c43e724c8b8f4: Status 404 returned error can't find the container with id fc57a9bbdedde4843b658be62d45c97e98a25200945f7ae1b36c43e724c8b8f4 Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.230944 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.240017 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.391740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhqpz\" (UniqueName: \"kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz\") pod \"08489187-6273-4bfb-b71e-4a0f4ce64103\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.391990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc\") pod \"84026ec8-3948-49fc-ba35-fccab5d78b37\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config\") pod \"08489187-6273-4bfb-b71e-4a0f4ce64103\" (UID: \"08489187-6273-4bfb-b71e-4a0f4ce64103\") " Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392225 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zscxm\" (UniqueName: \"kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm\") pod \"84026ec8-3948-49fc-ba35-fccab5d78b37\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392319 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config\") pod \"84026ec8-3948-49fc-ba35-fccab5d78b37\" (UID: \"84026ec8-3948-49fc-ba35-fccab5d78b37\") " Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392722 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config" (OuterVolumeSpecName: "config") pod "08489187-6273-4bfb-b71e-4a0f4ce64103" (UID: "08489187-6273-4bfb-b71e-4a0f4ce64103"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84026ec8-3948-49fc-ba35-fccab5d78b37" (UID: "84026ec8-3948-49fc-ba35-fccab5d78b37"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.392858 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config" (OuterVolumeSpecName: "config") pod "84026ec8-3948-49fc-ba35-fccab5d78b37" (UID: "84026ec8-3948-49fc-ba35-fccab5d78b37"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.435909 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz" (OuterVolumeSpecName: "kube-api-access-zhqpz") pod "08489187-6273-4bfb-b71e-4a0f4ce64103" (UID: "08489187-6273-4bfb-b71e-4a0f4ce64103"). InnerVolumeSpecName "kube-api-access-zhqpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.494426 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.494477 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08489187-6273-4bfb-b71e-4a0f4ce64103-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.494491 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84026ec8-3948-49fc-ba35-fccab5d78b37-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.494503 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhqpz\" (UniqueName: \"kubernetes.io/projected/08489187-6273-4bfb-b71e-4a0f4ce64103-kube-api-access-zhqpz\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.529324 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm" (OuterVolumeSpecName: "kube-api-access-zscxm") pod "84026ec8-3948-49fc-ba35-fccab5d78b37" (UID: "84026ec8-3948-49fc-ba35-fccab5d78b37"). InnerVolumeSpecName "kube-api-access-zscxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.596669 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zscxm\" (UniqueName: \"kubernetes.io/projected/84026ec8-3948-49fc-ba35-fccab5d78b37-kube-api-access-zscxm\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.823135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k" event={"ID":"b90ccac6-a973-4572-834a-f7215cfc72a7","Type":"ContainerStarted","Data":"62d949a38fede235d36424ee32be0c12d4fd89d246d0092a248e190a566c0074"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.826541 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a319b20-00fe-4182-8fdb-1f71c5f4f655" containerID="958144aac3c9104f251ccc54347eb7d0380a5b5ae9a7fa6ae97b8d0da691a98d" exitCode=0 Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.826601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" event={"ID":"7a319b20-00fe-4182-8fdb-1f71c5f4f655","Type":"ContainerDied","Data":"958144aac3c9104f251ccc54347eb7d0380a5b5ae9a7fa6ae97b8d0da691a98d"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.828514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" event={"ID":"84026ec8-3948-49fc-ba35-fccab5d78b37","Type":"ContainerDied","Data":"8064d681b03f1fe72fd640241bffdb2ce838dc01e08ba5ce74242a0f75f26615"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.828575 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6jdn7" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.834106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6xllm" event={"ID":"e0f3bd33-05e2-4174-a371-af75ef9fdb7d","Type":"ContainerStarted","Data":"b99681985d9d945a1d35663e3c847b49ead58720d5c5e4490d055cde8d2ec42e"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.842437 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b19f3e39-4198-4eef-bbe8-67e28fcef034","Type":"ContainerStarted","Data":"fc57a9bbdedde4843b658be62d45c97e98a25200945f7ae1b36c43e724c8b8f4"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.843806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" event={"ID":"08489187-6273-4bfb-b71e-4a0f4ce64103","Type":"ContainerDied","Data":"b02e26824a0858363ee8d6b2590cf9aa7c1830478fc3e641b8889ce7a9d27228"} Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.843906 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q4c77" Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.915931 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.933035 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6jdn7"] Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.957054 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:56:27 crc kubenswrapper[4792]: I0318 15:56:27.971921 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q4c77"] Mar 18 15:56:28 crc kubenswrapper[4792]: I0318 15:56:28.867289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerStarted","Data":"ae636930fec3372dcc7e75191085bce0723abda454d295f276c7db9542dc1d65"} Mar 18 15:56:28 crc kubenswrapper[4792]: I0318 15:56:28.905258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerStarted","Data":"7782a9ac2dbdefa63ef4731235464b9470b5ae66917c638938707289f3e36397"} Mar 18 15:56:29 crc kubenswrapper[4792]: I0318 15:56:29.866498 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08489187-6273-4bfb-b71e-4a0f4ce64103" path="/var/lib/kubelet/pods/08489187-6273-4bfb-b71e-4a0f4ce64103/volumes" Mar 18 15:56:29 crc kubenswrapper[4792]: I0318 15:56:29.867334 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84026ec8-3948-49fc-ba35-fccab5d78b37" path="/var/lib/kubelet/pods/84026ec8-3948-49fc-ba35-fccab5d78b37/volumes" Mar 18 15:56:29 crc kubenswrapper[4792]: I0318 15:56:29.922815 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-855f5dc7f-qnkcz" 
event={"ID":"0e8a660f-2f46-41b3-badb-2d1164cea860","Type":"ContainerStarted","Data":"cd7c0c3de7be9eb21c6faba032148f21d053e8f734b18e56e7b012246aab8710"} Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.328184 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.328273 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.328335 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.330302 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.330404 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f" gracePeriod=600 Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.933752 
4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f" exitCode=0 Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.933924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f"} Mar 18 15:56:30 crc kubenswrapper[4792]: I0318 15:56:30.934128 4792 scope.go:117] "RemoveContainer" containerID="f2f3b1d9e5efb71a659892b0519133711d0d4a704a137b617addb0c6d53c19c2" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.168588 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.188793 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-855f5dc7f-qnkcz" podStartSLOduration=29.188774917 podStartE2EDuration="29.188774917s" podCreationTimestamp="2026-03-18 15:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:29.951344851 +0000 UTC m=+1338.820673788" watchObservedRunningTime="2026-03-18 15:56:31.188774917 +0000 UTC m=+1340.058103854" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.274375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tth8d\" (UniqueName: \"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d\") pod \"7a319b20-00fe-4182-8fdb-1f71c5f4f655\" (UID: \"7a319b20-00fe-4182-8fdb-1f71c5f4f655\") " Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.283666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d" (OuterVolumeSpecName: "kube-api-access-tth8d") pod "7a319b20-00fe-4182-8fdb-1f71c5f4f655" (UID: "7a319b20-00fe-4182-8fdb-1f71c5f4f655"). InnerVolumeSpecName "kube-api-access-tth8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.376349 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tth8d\" (UniqueName: \"kubernetes.io/projected/7a319b20-00fe-4182-8fdb-1f71c5f4f655-kube-api-access-tth8d\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.961376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerStarted","Data":"c5fc2cfb1c9722a326bae1f674d2d1adc21ac88f24fd8fcae2b068a57737acf7"} Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.977530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" event={"ID":"7a319b20-00fe-4182-8fdb-1f71c5f4f655","Type":"ContainerDied","Data":"f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb"} Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.977568 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d99ab0e06ce1c83cc0b96b69874f08dc9f7c7bf9aad33c7e3a53e78633f9bb" Mar 18 15:56:31 crc kubenswrapper[4792]: I0318 15:56:31.977640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564156-j4qrz" Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.252857 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-gtk9r"] Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.263512 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564150-gtk9r"] Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.994529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"018a5f60-5274-4779-912f-4d7c32b6bfe5","Type":"ContainerStarted","Data":"877641f65256f870166235530babf01a0e7be7e226ff72cb2f08b823a95eaeeb"} Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.995025 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.997325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f119c5e-bb19-41d0-b87c-4962192e94e5","Type":"ContainerStarted","Data":"4b59d07bff9a30296aa9160dbc74444274016b84769d5c7e2dc49c91e75584d9"} Mar 18 15:56:32 crc kubenswrapper[4792]: I0318 15:56:32.999227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6xllm" event={"ID":"e0f3bd33-05e2-4174-a371-af75ef9fdb7d","Type":"ContainerStarted","Data":"082a0709f1da28f82b6456c43edb06e3bed6bf2a3a549b0129cfdab2b4911e1f"} Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.003600 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b19f3e39-4198-4eef-bbe8-67e28fcef034","Type":"ContainerStarted","Data":"e6ed43c15f214111b3ebfb09b3c1cd79ff0386894d6f767eb37fa15387477d87"} Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.017582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k" 
event={"ID":"b90ccac6-a973-4572-834a-f7215cfc72a7","Type":"ContainerStarted","Data":"d172f312d0b9e61b60ebd372f948293337e8de56fdada3082f735e7a5d513690"} Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.018069 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m977k" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.022995 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.681283887 podStartE2EDuration="32.02294605s" podCreationTimestamp="2026-03-18 15:56:01 +0000 UTC" firstStartedPulling="2026-03-18 15:56:11.39182148 +0000 UTC m=+1320.261150417" lastFinishedPulling="2026-03-18 15:56:31.733483643 +0000 UTC m=+1340.602812580" observedRunningTime="2026-03-18 15:56:33.01509481 +0000 UTC m=+1341.884423747" watchObservedRunningTime="2026-03-18 15:56:33.02294605 +0000 UTC m=+1341.892274987" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.025485 4792 generic.go:334] "Generic (PLEG): container finished" podID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerID="ae636930fec3372dcc7e75191085bce0723abda454d295f276c7db9542dc1d65" exitCode=0 Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.025589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerDied","Data":"ae636930fec3372dcc7e75191085bce0723abda454d295f276c7db9542dc1d65"} Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.033862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e"} Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.069320 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-m977k" podStartSLOduration=24.411649687 podStartE2EDuration="29.06929377s" podCreationTimestamp="2026-03-18 15:56:04 +0000 UTC" firstStartedPulling="2026-03-18 15:56:27.113862224 +0000 UTC m=+1335.983191161" lastFinishedPulling="2026-03-18 15:56:31.771506307 +0000 UTC m=+1340.640835244" observedRunningTime="2026-03-18 15:56:33.053130793 +0000 UTC m=+1341.922459730" watchObservedRunningTime="2026-03-18 15:56:33.06929377 +0000 UTC m=+1341.938622707" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.378226 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.379876 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.386890 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:33 crc kubenswrapper[4792]: I0318 15:56:33.867495 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f563e18-52c5-4193-b9e9-70d632536974" path="/var/lib/kubelet/pods/5f563e18-52c5-4193-b9e9-70d632536974/volumes" Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.047525 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerStarted","Data":"473056b2f961908787d6d0bbd7279324cef361cac398147642946a887e53c916"} Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.055493 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerStarted","Data":"f74bb4cde5f8d03c3be1828e98466fefc661c6bd28ec4a480e92e59898e2c8fa"} Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.060284 4792 generic.go:334] "Generic (PLEG): 
container finished" podID="e0f3bd33-05e2-4174-a371-af75ef9fdb7d" containerID="082a0709f1da28f82b6456c43edb06e3bed6bf2a3a549b0129cfdab2b4911e1f" exitCode=0 Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.062129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6xllm" event={"ID":"e0f3bd33-05e2-4174-a371-af75ef9fdb7d","Type":"ContainerDied","Data":"082a0709f1da28f82b6456c43edb06e3bed6bf2a3a549b0129cfdab2b4911e1f"} Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.068897 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-855f5dc7f-qnkcz" Mar 18 15:56:34 crc kubenswrapper[4792]: I0318 15:56:34.198377 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:56:35 crc kubenswrapper[4792]: I0318 15:56:35.071195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerStarted","Data":"138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a"} Mar 18 15:56:35 crc kubenswrapper[4792]: I0318 15:56:35.073476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerStarted","Data":"fa9bbc577dbb39ebc10f18bbba3367106c73587c29f277b3b0b30ef4c1139b9d"} Mar 18 15:56:35 crc kubenswrapper[4792]: I0318 15:56:35.076110 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerStarted","Data":"8ca1e9174e1b68cb3a500c36721db12755c75144ada52226620997217d432a3c"} Mar 18 15:56:35 crc kubenswrapper[4792]: I0318 15:56:35.101451 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.362169549 podStartE2EDuration="39.101435941s" 
podCreationTimestamp="2026-03-18 15:55:56 +0000 UTC" firstStartedPulling="2026-03-18 15:55:58.915685397 +0000 UTC m=+1307.785014334" lastFinishedPulling="2026-03-18 15:56:25.654951779 +0000 UTC m=+1334.524280726" observedRunningTime="2026-03-18 15:56:35.093486997 +0000 UTC m=+1343.962815934" watchObservedRunningTime="2026-03-18 15:56:35.101435941 +0000 UTC m=+1343.970764878"
Mar 18 15:56:36 crc kubenswrapper[4792]: I0318 15:56:36.085607 4792 generic.go:334] "Generic (PLEG): container finished" podID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerID="c5fc2cfb1c9722a326bae1f674d2d1adc21ac88f24fd8fcae2b068a57737acf7" exitCode=0
Mar 18 15:56:36 crc kubenswrapper[4792]: I0318 15:56:36.086090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerDied","Data":"c5fc2cfb1c9722a326bae1f674d2d1adc21ac88f24fd8fcae2b068a57737acf7"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.096895 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f119c5e-bb19-41d0-b87c-4962192e94e5","Type":"ContainerStarted","Data":"2a0a15b7682d3cb190beb8bc03335a53220b7ed9a7a188d8730c5dc18053e6fc"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.100158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6xllm" event={"ID":"e0f3bd33-05e2-4174-a371-af75ef9fdb7d","Type":"ContainerStarted","Data":"600305a8b150b34b6ac785539bb244a60b2e5915d0c5e2e37be6cf87dd58f1dc"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.100205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6xllm" event={"ID":"e0f3bd33-05e2-4174-a371-af75ef9fdb7d","Type":"ContainerStarted","Data":"593377ccde1eeed5dcdc5a47cfaf0b8b80de9eb67be51e4906d14de5e3849685"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.101097 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6xllm"
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.101134 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6xllm"
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.103196 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b19f3e39-4198-4eef-bbe8-67e28fcef034","Type":"ContainerStarted","Data":"61ee86951d39ba612af4eb3e55807af008aa25e2895b1f8991cd702c48be3f4b"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.106250 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerStarted","Data":"1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862"}
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.120697 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.425880287 podStartE2EDuration="30.120678601s" podCreationTimestamp="2026-03-18 15:56:07 +0000 UTC" firstStartedPulling="2026-03-18 15:56:25.675845436 +0000 UTC m=+1334.545174373" lastFinishedPulling="2026-03-18 15:56:36.37064375 +0000 UTC m=+1345.239972687" observedRunningTime="2026-03-18 15:56:37.116236769 +0000 UTC m=+1345.985565726" watchObservedRunningTime="2026-03-18 15:56:37.120678601 +0000 UTC m=+1345.990007538"
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.155783 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=15.905782301 podStartE2EDuration="40.15576425s" podCreationTimestamp="2026-03-18 15:55:57 +0000 UTC" firstStartedPulling="2026-03-18 15:56:01.634161775 +0000 UTC m=+1310.503490712" lastFinishedPulling="2026-03-18 15:56:25.884143724 +0000 UTC m=+1334.753472661" observedRunningTime="2026-03-18 15:56:37.149014194 +0000 UTC m=+1346.018343141" watchObservedRunningTime="2026-03-18 15:56:37.15576425 +0000 UTC m=+1346.025093177"
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.168301 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6xllm" podStartSLOduration=28.625147153 podStartE2EDuration="33.16828034s" podCreationTimestamp="2026-03-18 15:56:04 +0000 UTC" firstStartedPulling="2026-03-18 15:56:27.12499589 +0000 UTC m=+1335.994324827" lastFinishedPulling="2026-03-18 15:56:31.668129077 +0000 UTC m=+1340.537458014" observedRunningTime="2026-03-18 15:56:37.164765098 +0000 UTC m=+1346.034094035" watchObservedRunningTime="2026-03-18 15:56:37.16828034 +0000 UTC m=+1346.037609277"
Mar 18 15:56:37 crc kubenswrapper[4792]: I0318 15:56:37.188906 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.951262602 podStartE2EDuration="32.188887747s" podCreationTimestamp="2026-03-18 15:56:05 +0000 UTC" firstStartedPulling="2026-03-18 15:56:27.133135099 +0000 UTC m=+1336.002464036" lastFinishedPulling="2026-03-18 15:56:36.370760244 +0000 UTC m=+1345.240089181" observedRunningTime="2026-03-18 15:56:37.182533024 +0000 UTC m=+1346.051861981" watchObservedRunningTime="2026-03-18 15:56:37.188887747 +0000 UTC m=+1346.058216684"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.116767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4cc19e41-291b-4aa8-b862-2efc890cea99","Type":"ContainerStarted","Data":"d005c56ef5a735aeeba157987328709e8b811004f528f288f74381827a34a8c5"}
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.117492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.120416 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.120465 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.138560 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.170566819 podStartE2EDuration="39.138538709s" podCreationTimestamp="2026-03-18 15:55:59 +0000 UTC" firstStartedPulling="2026-03-18 15:56:00.417927496 +0000 UTC m=+1309.287256433" lastFinishedPulling="2026-03-18 15:56:37.385899386 +0000 UTC m=+1346.255228323" observedRunningTime="2026-03-18 15:56:38.134300263 +0000 UTC m=+1347.003629210" watchObservedRunningTime="2026-03-18 15:56:38.138538709 +0000 UTC m=+1347.007867646"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.515515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.515572 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:56:38 crc kubenswrapper[4792]: I0318 15:56:38.560471 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.170901 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.482608 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.551940 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.551994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.577050 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-v4226"]
Mar 18 15:56:39 crc kubenswrapper[4792]: E0318 15:56:39.577558 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a319b20-00fe-4182-8fdb-1f71c5f4f655" containerName="oc"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.577583 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a319b20-00fe-4182-8fdb-1f71c5f4f655" containerName="oc"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.577837 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a319b20-00fe-4182-8fdb-1f71c5f4f655" containerName="oc"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.578793 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.581778 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.607660 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.611862 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.624770 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.657473 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.659918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-v4226"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696567 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovn-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovs-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4fe675-5de2-4d0f-88c4-611c24091ffa-config\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzv4\" (UniqueName: \"kubernetes.io/projected/bd4fe675-5de2-4d0f-88c4-611c24091ffa-kube-api-access-nfzv4\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696744 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-combined-ca-bundle\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696806 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7g6h\" (UniqueName: \"kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.696883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.720029 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.743832 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzv4\" (UniqueName: \"kubernetes.io/projected/bd4fe675-5de2-4d0f-88c4-611c24091ffa-kube-api-access-nfzv4\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-combined-ca-bundle\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799590 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7g6h\" (UniqueName: \"kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovn-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovs-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799878 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4fe675-5de2-4d0f-88c4-611c24091ffa-config\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.799933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.801061 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.807958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.809657 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-combined-ca-bundle\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.810090 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.810116 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovn-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.810694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4fe675-5de2-4d0f-88c4-611c24091ffa-config\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.811093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bd4fe675-5de2-4d0f-88c4-611c24091ffa-ovs-rundir\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.821814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4fe675-5de2-4d0f-88c4-611c24091ffa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.822392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.828316 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"]
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.830945 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.851496 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.856874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7g6h\" (UniqueName: \"kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h\") pod \"dnsmasq-dns-6bc7876d45-62hj5\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.890773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzv4\" (UniqueName: \"kubernetes.io/projected/bd4fe675-5de2-4d0f-88c4-611c24091ffa-kube-api-access-nfzv4\") pod \"ovn-controller-metrics-v4226\" (UID: \"bd4fe675-5de2-4d0f-88c4-611c24091ffa\") " pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.904147 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.904270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xqs\" (UniqueName: \"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.904499 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.904556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.904595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.926807 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-v4226"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.957491 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5"
Mar 18 15:56:39 crc kubenswrapper[4792]: I0318 15:56:39.969074 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"]
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.010634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.010692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.010733 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.010800 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.010850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2xqs\" (UniqueName: \"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.011899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.012265 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.012711 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.012928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.070008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2xqs\" (UniqueName: \"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs\") pod \"dnsmasq-dns-8554648995-ghqxp\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.162332 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ghqxp"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.196785 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.310637 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.470818 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.603314 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.614376 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.710023 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4bll\" (UniqueName: \"kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll\") pod \"9d818297-1bcf-4ad2-805e-b30d10670888\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.710288 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc\") pod \"9d818297-1bcf-4ad2-805e-b30d10670888\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.710358 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config\") pod \"9d818297-1bcf-4ad2-805e-b30d10670888\" (UID: \"9d818297-1bcf-4ad2-805e-b30d10670888\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.711488 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config" (OuterVolumeSpecName: "config") pod "9d818297-1bcf-4ad2-805e-b30d10670888" (UID: "9d818297-1bcf-4ad2-805e-b30d10670888"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.713775 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.719277 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d818297-1bcf-4ad2-805e-b30d10670888" (UID: "9d818297-1bcf-4ad2-805e-b30d10670888"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.719493 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll" (OuterVolumeSpecName: "kube-api-access-x4bll") pod "9d818297-1bcf-4ad2-805e-b30d10670888" (UID: "9d818297-1bcf-4ad2-805e-b30d10670888"). InnerVolumeSpecName "kube-api-access-x4bll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.771450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.773193 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.776487 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.776792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.777278 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mczh9"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.777328 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.784180 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.811861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config\") pod \"4665db60-5e6b-4a39-8750-bfabb9ea2631\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.811963 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njf7l\" (UniqueName: \"kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l\") pod \"4665db60-5e6b-4a39-8750-bfabb9ea2631\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812111 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc\") pod \"4665db60-5e6b-4a39-8750-bfabb9ea2631\" (UID: \"4665db60-5e6b-4a39-8750-bfabb9ea2631\") "
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812433 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-config\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-scripts\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812502 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config" (OuterVolumeSpecName: "config") pod "4665db60-5e6b-4a39-8750-bfabb9ea2631" (UID: "4665db60-5e6b-4a39-8750-bfabb9ea2631"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4665db60-5e6b-4a39-8750-bfabb9ea2631" (UID: "4665db60-5e6b-4a39-8750-bfabb9ea2631"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.812875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4wq\" (UniqueName: \"kubernetes.io/projected/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-kube-api-access-pz4wq\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813032 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0"
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813318 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4bll\" (UniqueName: \"kubernetes.io/projected/9d818297-1bcf-4ad2-805e-b30d10670888-kube-api-access-x4bll\") on node \"crc\" DevicePath \"\""
Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813343 4792 reconciler_common.go:293]
"Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813356 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813369 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d818297-1bcf-4ad2-805e-b30d10670888-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.813382 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4665db60-5e6b-4a39-8750-bfabb9ea2631-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.816885 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l" (OuterVolumeSpecName: "kube-api-access-njf7l") pod "4665db60-5e6b-4a39-8750-bfabb9ea2631" (UID: "4665db60-5e6b-4a39-8750-bfabb9ea2631"). InnerVolumeSpecName "kube-api-access-njf7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.914836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-config\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.914899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-scripts\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915108 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pz4wq\" (UniqueName: \"kubernetes.io/projected/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-kube-api-access-pz4wq\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.915423 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njf7l\" (UniqueName: \"kubernetes.io/projected/4665db60-5e6b-4a39-8750-bfabb9ea2631-kube-api-access-njf7l\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.918273 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.918842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-scripts\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.918985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-config\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.921924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.924774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.938928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.942903 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4wq\" (UniqueName: \"kubernetes.io/projected/7c13a8c4-d4ee-4af5-95bd-c28a60350d14-kube-api-access-pz4wq\") pod \"ovn-northd-0\" (UID: \"7c13a8c4-d4ee-4af5-95bd-c28a60350d14\") " pod="openstack/ovn-northd-0" Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.983257 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"] Mar 18 15:56:40 crc kubenswrapper[4792]: I0318 15:56:40.995141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-v4226"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.103615 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.154209 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.227581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" event={"ID":"9d818297-1bcf-4ad2-805e-b30d10670888","Type":"ContainerDied","Data":"9c22b680f7bad409a4261bdf489c7f20310ff0658890d693a166ef7b6a5c7502"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.227699 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-2qggt" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.233402 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerID="fa9bbc577dbb39ebc10f18bbba3367106c73587c29f277b3b0b30ef4c1139b9d" exitCode=0 Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.233469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerDied","Data":"fa9bbc577dbb39ebc10f18bbba3367106c73587c29f277b3b0b30ef4c1139b9d"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.246828 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" event={"ID":"4665db60-5e6b-4a39-8750-bfabb9ea2631","Type":"ContainerDied","Data":"32f1d902126d995012fe1884a137cd1ddd01dcf965a2709a1ab2b0ad4d7d1ba8"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.246944 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fx8zk" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.253263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-v4226" event={"ID":"bd4fe675-5de2-4d0f-88c4-611c24091ffa","Type":"ContainerStarted","Data":"765d85d54eb819bc0cafc4d8058cc4fb799679d5a56e14b102ae37727e61d1f3"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.255896 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" event={"ID":"3b4b3c48-2731-463d-a64b-bbdcae951e82","Type":"ContainerStarted","Data":"56eca9f76da1bf6a35f9039e9d499100a6a2331a8901031bf94d5dd73c38e5e6"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.258100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ghqxp" event={"ID":"8e546b43-e259-4d0e-81f8-2381371157bd","Type":"ContainerStarted","Data":"dc34ece1e0e033647b11168a027ebef8add535ef32b06c45b8edc927fcd7e3a4"} Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.341887 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.364798 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-2qggt"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.382487 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.389611 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fx8zk"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.671586 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.871759 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4665db60-5e6b-4a39-8750-bfabb9ea2631" 
path="/var/lib/kubelet/pods/4665db60-5e6b-4a39-8750-bfabb9ea2631/volumes" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.872585 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d818297-1bcf-4ad2-805e-b30d10670888" path="/var/lib/kubelet/pods/9d818297-1bcf-4ad2-805e-b30d10670888/volumes" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.981470 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4ww62"] Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.985381 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:41 crc kubenswrapper[4792]: I0318 15:56:41.989423 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4ww62"] Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.041245 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.065660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.065746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75m9\" (UniqueName: \"kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.101161 4792 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-e409-account-create-update-jmr8x"] Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.102367 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.104883 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.113304 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e409-account-create-update-jmr8x"] Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.168150 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.168263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrk6f\" (UniqueName: \"kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.168296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " 
pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.168328 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s75m9\" (UniqueName: \"kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.169120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.187831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75m9\" (UniqueName: \"kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9\") pod \"mysqld-exporter-openstack-db-create-4ww62\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.270783 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.270923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrk6f\" (UniqueName: 
\"kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.272000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.272123 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerID="71da5f56a6fbec3a48e0d7f3228a022e08687a6077e9db4a7d983aa42a8ab3ea" exitCode=0 Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.272538 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" event={"ID":"3b4b3c48-2731-463d-a64b-bbdcae951e82","Type":"ContainerDied","Data":"71da5f56a6fbec3a48e0d7f3228a022e08687a6077e9db4a7d983aa42a8ab3ea"} Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.277704 4792 generic.go:334] "Generic (PLEG): container finished" podID="8e546b43-e259-4d0e-81f8-2381371157bd" containerID="9f7cb884f537d7764649e668e844c489199bd103b4ae8636dc9c6675e3118b23" exitCode=0 Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.277749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ghqxp" event={"ID":"8e546b43-e259-4d0e-81f8-2381371157bd","Type":"ContainerDied","Data":"9f7cb884f537d7764649e668e844c489199bd103b4ae8636dc9c6675e3118b23"} Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.279618 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"7c13a8c4-d4ee-4af5-95bd-c28a60350d14","Type":"ContainerStarted","Data":"5ca770074b62b9168fb35e6610112016d43d915abc052f513d5d1da9c5ea8560"} Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.281098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-v4226" event={"ID":"bd4fe675-5de2-4d0f-88c4-611c24091ffa","Type":"ContainerStarted","Data":"63d58ea168af954d625e589eec4e7332525bbc2833fcd94e97b44ef1a78ef06f"} Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.298817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrk6f\" (UniqueName: \"kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f\") pod \"mysqld-exporter-e409-account-create-update-jmr8x\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.329374 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.397557 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-v4226" podStartSLOduration=3.397533617 podStartE2EDuration="3.397533617s" podCreationTimestamp="2026-03-18 15:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:42.340821757 +0000 UTC m=+1351.210150694" watchObservedRunningTime="2026-03-18 15:56:42.397533617 +0000 UTC m=+1351.266862554" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.447563 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.700265 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:42 crc kubenswrapper[4792]: I0318 15:56:42.796799 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 15:56:43 crc kubenswrapper[4792]: I0318 15:56:43.519614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e409-account-create-update-jmr8x"] Mar 18 15:56:43 crc kubenswrapper[4792]: W0318 15:56:43.532099 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb6d4a8_112c_41fa_b6f2_41b6d7882908.slice/crio-e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa WatchSource:0}: Error finding container e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa: Status 404 returned error can't find the container with id e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa Mar 18 15:56:43 crc kubenswrapper[4792]: W0318 15:56:43.674684 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09c36889_6457_497c_b5bc_3c55e80356cf.slice/crio-97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0 WatchSource:0}: Error finding container 97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0: Status 404 returned error can't find the container with id 97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0 Mar 18 15:56:43 crc kubenswrapper[4792]: I0318 15:56:43.675104 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4ww62"] Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.302630 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" event={"ID":"675f6ffb-b144-4efc-b47a-81c748cb4765","Type":"ContainerStarted","Data":"892dbf5dd65d4a2f3b0ec933fc8ed31cea21374191549deb1eaa891698070b0a"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.307138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ghqxp" event={"ID":"8e546b43-e259-4d0e-81f8-2381371157bd","Type":"ContainerStarted","Data":"1a49425564367c011a4dc8f6f02fb109cfcd29cda98c9a5d255c481d60111d94"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.308264 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-ghqxp" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.310667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" event={"ID":"3bb6d4a8-112c-41fa-b6f2-41b6d7882908","Type":"ContainerStarted","Data":"dc5297a5d704e6189454e7f868c627788f3f4cc640d63594e9197c9c1b062f0d"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.310712 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" event={"ID":"3bb6d4a8-112c-41fa-b6f2-41b6d7882908","Type":"ContainerStarted","Data":"e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.313916 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" event={"ID":"09c36889-6457-497c-b5bc-3c55e80356cf","Type":"ContainerStarted","Data":"33d6e242b773390ac2be00843c0f5628fccbd0d8a9dadbdcab6df60e794e709b"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.313947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" 
event={"ID":"09c36889-6457-497c-b5bc-3c55e80356cf","Type":"ContainerStarted","Data":"97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.331802 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" event={"ID":"3b4b3c48-2731-463d-a64b-bbdcae951e82","Type":"ContainerStarted","Data":"ecdb27585784ead5aa6df245a5ca3aba0bad0e823ddb60af7cad011f11006b0d"} Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.332735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.334555 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-ct7p5" podStartSLOduration=24.854921869000002 podStartE2EDuration="42.334543172s" podCreationTimestamp="2026-03-18 15:56:02 +0000 UTC" firstStartedPulling="2026-03-18 15:56:25.651708755 +0000 UTC m=+1334.521037692" lastFinishedPulling="2026-03-18 15:56:43.131330058 +0000 UTC m=+1352.000658995" observedRunningTime="2026-03-18 15:56:44.327928391 +0000 UTC m=+1353.197257328" watchObservedRunningTime="2026-03-18 15:56:44.334543172 +0000 UTC m=+1353.203872109" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.355991 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-ghqxp" podStartSLOduration=4.8339232039999995 podStartE2EDuration="5.355959426s" podCreationTimestamp="2026-03-18 15:56:39 +0000 UTC" firstStartedPulling="2026-03-18 15:56:41.170810212 +0000 UTC m=+1350.040139149" lastFinishedPulling="2026-03-18 15:56:41.692846434 +0000 UTC m=+1350.562175371" observedRunningTime="2026-03-18 15:56:44.34954248 +0000 UTC m=+1353.218871417" watchObservedRunningTime="2026-03-18 15:56:44.355959426 +0000 UTC m=+1353.225288363" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.388439 
4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" podStartSLOduration=3.388418031 podStartE2EDuration="3.388418031s" podCreationTimestamp="2026-03-18 15:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:44.3664578 +0000 UTC m=+1353.235786747" watchObservedRunningTime="2026-03-18 15:56:44.388418031 +0000 UTC m=+1353.257746968" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.394731 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" podStartSLOduration=2.394714422 podStartE2EDuration="2.394714422s" podCreationTimestamp="2026-03-18 15:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:44.38022345 +0000 UTC m=+1353.249552397" watchObservedRunningTime="2026-03-18 15:56:44.394714422 +0000 UTC m=+1353.264043359" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.414402 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" podStartSLOduration=4.803784952 podStartE2EDuration="5.41437678s" podCreationTimestamp="2026-03-18 15:56:39 +0000 UTC" firstStartedPulling="2026-03-18 15:56:40.986721097 +0000 UTC m=+1349.856050034" lastFinishedPulling="2026-03-18 15:56:41.597312925 +0000 UTC m=+1350.466641862" observedRunningTime="2026-03-18 15:56:44.399144494 +0000 UTC m=+1353.268473431" watchObservedRunningTime="2026-03-18 15:56:44.41437678 +0000 UTC m=+1353.283705717" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.416112 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.635778 4792 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-db-create-7skkc"] Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.637880 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.659321 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7skkc"] Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.749327 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5d26-account-create-update-r7cpt"] Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.751618 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.765341 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.770556 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d26-account-create-update-r7cpt"] Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.771324 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbdm\" (UniqueName: \"kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.771480 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.873809 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.873917 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbdm\" (UniqueName: \"kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.874010 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.874077 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jkb\" (UniqueName: \"kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.874688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.896761 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbdm\" (UniqueName: \"kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm\") pod \"glance-db-create-7skkc\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.966403 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7skkc" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.975415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jkb\" (UniqueName: \"kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.975773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.977105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:44 crc kubenswrapper[4792]: I0318 15:56:44.998541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jkb\" (UniqueName: 
\"kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb\") pod \"glance-5d26-account-create-update-r7cpt\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:45 crc kubenswrapper[4792]: I0318 15:56:45.078435 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:45 crc kubenswrapper[4792]: I0318 15:56:45.650208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7skkc"] Mar 18 15:56:45 crc kubenswrapper[4792]: W0318 15:56:45.654874 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f526f38_b789_46a5_96c9_9d2c5a820e51.slice/crio-28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a WatchSource:0}: Error finding container 28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a: Status 404 returned error can't find the container with id 28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a Mar 18 15:56:45 crc kubenswrapper[4792]: I0318 15:56:45.731702 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d26-account-create-update-r7cpt"] Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.382882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7skkc" event={"ID":"6f526f38-b789-46a5-96c9-9d2c5a820e51","Type":"ContainerStarted","Data":"28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a"} Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.386565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d26-account-create-update-r7cpt" event={"ID":"4d342d2e-9afe-442f-9740-ba8f58e4f2b4","Type":"ContainerStarted","Data":"a20675a875868b1779245bcde43cc0d32a25e128ae7383f3b977228e17ddfda2"} Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 
15:56:46.388657 4792 generic.go:334] "Generic (PLEG): container finished" podID="09c36889-6457-497c-b5bc-3c55e80356cf" containerID="33d6e242b773390ac2be00843c0f5628fccbd0d8a9dadbdcab6df60e794e709b" exitCode=0 Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.388721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" event={"ID":"09c36889-6457-497c-b5bc-3c55e80356cf","Type":"ContainerDied","Data":"33d6e242b773390ac2be00843c0f5628fccbd0d8a9dadbdcab6df60e794e709b"} Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.402648 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p2wms"] Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.404664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.410643 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.428508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2wms"] Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.509210 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts\") pod \"root-account-create-update-p2wms\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.509345 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbs4\" (UniqueName: \"kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4\") pod \"root-account-create-update-p2wms\" (UID: 
\"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.612827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts\") pod \"root-account-create-update-p2wms\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.613533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbs4\" (UniqueName: \"kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4\") pod \"root-account-create-update-p2wms\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.613797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts\") pod \"root-account-create-update-p2wms\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.646248 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbs4\" (UniqueName: \"kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4\") pod \"root-account-create-update-p2wms\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:46 crc kubenswrapper[4792]: I0318 15:56:46.743727 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:47 crc kubenswrapper[4792]: I0318 15:56:47.219853 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2wms"] Mar 18 15:56:47 crc kubenswrapper[4792]: W0318 15:56:47.229385 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39cc7bb1_7d0e_4efd_bf91_1774a5e3765b.slice/crio-7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206 WatchSource:0}: Error finding container 7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206: Status 404 returned error can't find the container with id 7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206 Mar 18 15:56:47 crc kubenswrapper[4792]: I0318 15:56:47.406409 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2wms" event={"ID":"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b","Type":"ContainerStarted","Data":"7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206"} Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.143732 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.260248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s75m9\" (UniqueName: \"kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9\") pod \"09c36889-6457-497c-b5bc-3c55e80356cf\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.260327 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts\") pod \"09c36889-6457-497c-b5bc-3c55e80356cf\" (UID: \"09c36889-6457-497c-b5bc-3c55e80356cf\") " Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.261185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09c36889-6457-497c-b5bc-3c55e80356cf" (UID: "09c36889-6457-497c-b5bc-3c55e80356cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.266417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9" (OuterVolumeSpecName: "kube-api-access-s75m9") pod "09c36889-6457-497c-b5bc-3c55e80356cf" (UID: "09c36889-6457-497c-b5bc-3c55e80356cf"). InnerVolumeSpecName "kube-api-access-s75m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.363049 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s75m9\" (UniqueName: \"kubernetes.io/projected/09c36889-6457-497c-b5bc-3c55e80356cf-kube-api-access-s75m9\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.363083 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c36889-6457-497c-b5bc-3c55e80356cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.421906 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" event={"ID":"09c36889-6457-497c-b5bc-3c55e80356cf","Type":"ContainerDied","Data":"97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0"} Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.421990 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ec3b26daf9a373cd12c1115c7dca88dc274c1a1a9e6377ccfd23ae23ba98c0" Mar 18 15:56:48 crc kubenswrapper[4792]: I0318 15:56:48.422087 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4ww62" Mar 18 15:56:49 crc kubenswrapper[4792]: I0318 15:56:49.960163 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.165761 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-ghqxp" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.246643 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.374246 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-n7whx"] Mar 18 15:56:50 crc kubenswrapper[4792]: E0318 15:56:50.374762 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c36889-6457-497c-b5bc-3c55e80356cf" containerName="mariadb-database-create" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.374787 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c36889-6457-497c-b5bc-3c55e80356cf" containerName="mariadb-database-create" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.375107 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c36889-6457-497c-b5bc-3c55e80356cf" containerName="mariadb-database-create" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.375962 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.393065 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n7whx"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.458267 4792 generic.go:334] "Generic (PLEG): container finished" podID="3bb6d4a8-112c-41fa-b6f2-41b6d7882908" containerID="dc5297a5d704e6189454e7f868c627788f3f4cc640d63594e9197c9c1b062f0d" exitCode=0 Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.458371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" event={"ID":"3bb6d4a8-112c-41fa-b6f2-41b6d7882908","Type":"ContainerDied","Data":"dc5297a5d704e6189454e7f868c627788f3f4cc640d63594e9197c9c1b062f0d"} Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.471754 4792 generic.go:334] "Generic (PLEG): container finished" podID="4d342d2e-9afe-442f-9740-ba8f58e4f2b4" containerID="26b8254bf32c9f070adef10c58cc4b1b07e1b20350731bbbf7b72ab9160d432f" exitCode=0 Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.471901 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d26-account-create-update-r7cpt" event={"ID":"4d342d2e-9afe-442f-9740-ba8f58e4f2b4","Type":"ContainerDied","Data":"26b8254bf32c9f070adef10c58cc4b1b07e1b20350731bbbf7b72ab9160d432f"} Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.473931 4792 generic.go:334] "Generic (PLEG): container finished" podID="39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" containerID="2b3d50ec8af83e573a82f3981293eacc6e8aef89012c7ea6b8ed678f9d336d9b" exitCode=0 Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.474059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2wms" event={"ID":"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b","Type":"ContainerDied","Data":"2b3d50ec8af83e573a82f3981293eacc6e8aef89012c7ea6b8ed678f9d336d9b"} Mar 18 15:56:50 crc 
kubenswrapper[4792]: I0318 15:56:50.481153 4792 generic.go:334] "Generic (PLEG): container finished" podID="6f526f38-b789-46a5-96c9-9d2c5a820e51" containerID="f16c1aa5e8f590b86442dd8e688ed4bc4f4200e4f31730326640ee2d9898930d" exitCode=0 Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.481291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7skkc" event={"ID":"6f526f38-b789-46a5-96c9-9d2c5a820e51","Type":"ContainerDied","Data":"f16c1aa5e8f590b86442dd8e688ed4bc4f4200e4f31730326640ee2d9898930d"} Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.481442 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="dnsmasq-dns" containerID="cri-o://ecdb27585784ead5aa6df245a5ca3aba0bad0e823ddb60af7cad011f11006b0d" gracePeriod=10 Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.490855 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-98a6-account-create-update-hg25j"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.492410 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.499384 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.505475 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.505704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7jch\" (UniqueName: \"kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.510298 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-98a6-account-create-update-hg25j"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.607231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7jch\" (UniqueName: \"kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.607372 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " 
pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.607421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.607447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntkk4\" (UniqueName: \"kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.608606 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.633721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7jch\" (UniqueName: \"kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch\") pod \"keystone-db-create-n7whx\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.676726 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ft5cr"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.680959 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.700469 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n7whx" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.703525 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ft5cr"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.708778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.708816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntkk4\" (UniqueName: \"kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.709708 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.749622 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntkk4\" (UniqueName: \"kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4\") pod \"keystone-98a6-account-create-update-hg25j\" (UID: 
\"6be42760-b224-4ca3-8870-92131f90c77b\") " pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.805052 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f402-account-create-update-qxl8l"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.808907 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.811699 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7jj\" (UniqueName: \"kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.811889 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.812098 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.834680 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f402-account-create-update-qxl8l"] Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.864349 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.914171 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdzh5\" (UniqueName: \"kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.914362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7jj\" (UniqueName: \"kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.914502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.914588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.915423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:50 crc kubenswrapper[4792]: I0318 15:56:50.944510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7jj\" (UniqueName: \"kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj\") pod \"placement-db-create-ft5cr\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.006816 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ft5cr" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.016557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.016697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdzh5\" (UniqueName: \"kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.017934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " 
pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.057142 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdzh5\" (UniqueName: \"kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5\") pod \"placement-f402-account-create-update-qxl8l\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.135335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.504306 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerID="ecdb27585784ead5aa6df245a5ca3aba0bad0e823ddb60af7cad011f11006b0d" exitCode=0 Mar 18 15:56:51 crc kubenswrapper[4792]: I0318 15:56:51.504773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" event={"ID":"3b4b3c48-2731-463d-a64b-bbdcae951e82","Type":"ContainerDied","Data":"ecdb27585784ead5aa6df245a5ca3aba0bad0e823ddb60af7cad011f11006b0d"} Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.066046 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.070357 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.095918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.164905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxwm\" (UniqueName: \"kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.165424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.165600 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.165806 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.165983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.268354 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxwm\" (UniqueName: \"kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.268486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.268562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.268639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.268676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.269768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.269790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.270503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.270736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.297818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxwm\" (UniqueName: \"kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm\") pod \"dnsmasq-dns-b8fbc5445-fdrj5\" 
(UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:52 crc kubenswrapper[4792]: I0318 15:56:52.395507 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.298960 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.309446 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.311567 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-55fs6" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.311727 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.311818 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.311937 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.324740 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-cache\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-lock\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9qd\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-kube-api-access-qq9qd\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407535 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407782 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c345496-7b4e-41f0-a5ae-4c503e452221-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.407862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.509847 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.510111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c345496-7b4e-41f0-a5ae-4c503e452221-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.510179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.510237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-cache\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.510281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-lock\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: E0318 15:56:53.510444 4792 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:53 crc kubenswrapper[4792]: E0318 15:56:53.510483 4792 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found 
Mar 18 15:56:53 crc kubenswrapper[4792]: E0318 15:56:53.510551 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:56:54.01052464 +0000 UTC m=+1362.879853637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.510604 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9qd\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-kube-api-access-qq9qd\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.511119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-cache\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.511422 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3c345496-7b4e-41f0-a5ae-4c503e452221-lock\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.512824 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.512883 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e3fd556e2dd171f903ec534fb756b8b153090802fc13c1c30d2baa8126c4cac/globalmount\"" pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.520138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c345496-7b4e-41f0-a5ae-4c503e452221-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.535090 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9qd\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-kube-api-access-qq9qd\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.576958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfbf2b6d-fab1-4283-a09e-a2fd40855535\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.878159 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nxf67"] Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.879709 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.882849 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.882920 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.882955 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.902832 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nxf67"] Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.966383 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nxf67"] Mar 18 15:56:53 crc kubenswrapper[4792]: E0318 15:56:53.967229 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-5q9tk ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-5q9tk ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-nxf67" podUID="3ef81cd2-82fe-4fc7-ae71-0496c310f213" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.982803 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rks8d"] Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.984567 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:53 crc kubenswrapper[4792]: I0318 15:56:53.991963 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rks8d"] Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.023983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.024321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.024958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.025096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.025263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: E0318 15:56:54.025297 4792 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.025319 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: E0318 15:56:54.025323 4792 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.025413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9tk\" (UniqueName: \"kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: E0318 15:56:54.025440 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:56:55.025406904 +0000 UTC m=+1363.894736031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.025613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.127745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128237 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128270 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128418 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128444 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9tk\" (UniqueName: 
\"kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdb55\" (UniqueName: \"kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128525 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.128653 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.129448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.129962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.130845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.133186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.136112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf\") pod \"swift-ring-rebalance-nxf67\" (UID: 
\"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.142640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.160845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9tk\" (UniqueName: \"kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk\") pod \"swift-ring-rebalance-nxf67\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231704 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 
15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231854 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.231915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdb55\" (UniqueName: \"kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.232842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.232945 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.233365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.235954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.236960 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.242285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.260623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdb55\" (UniqueName: 
\"kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55\") pod \"swift-ring-rebalance-rks8d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.310526 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.550272 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.585503 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.642923 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643333 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643574 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9tk\" (UniqueName: \"kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf\") pod \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\" (UID: \"3ef81cd2-82fe-4fc7-ae71-0496c310f213\") " Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643867 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: 
"3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.643941 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts" (OuterVolumeSpecName: "scripts") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.644584 4792 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.644609 4792 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef81cd2-82fe-4fc7-ae71-0496c310f213-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.644618 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef81cd2-82fe-4fc7-ae71-0496c310f213-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.649705 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.649762 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.649950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.650365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk" (OuterVolumeSpecName: "kube-api-access-5q9tk") pod "3ef81cd2-82fe-4fc7-ae71-0496c310f213" (UID: "3ef81cd2-82fe-4fc7-ae71-0496c310f213"). InnerVolumeSpecName "kube-api-access-5q9tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.750341 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.750390 4792 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.750404 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9tk\" (UniqueName: \"kubernetes.io/projected/3ef81cd2-82fe-4fc7-ae71-0496c310f213-kube-api-access-5q9tk\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.750419 4792 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef81cd2-82fe-4fc7-ae71-0496c310f213-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:54 crc kubenswrapper[4792]: I0318 15:56:54.959209 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.058705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:55 crc kubenswrapper[4792]: E0318 15:56:55.058949 4792 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 
15:56:55 crc kubenswrapper[4792]: E0318 15:56:55.059005 4792 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:56:55 crc kubenswrapper[4792]: E0318 15:56:55.059069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:56:57.059048576 +0000 UTC m=+1365.928377513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.558485 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nxf67" Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.643807 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nxf67"] Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.653441 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nxf67"] Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.871385 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef81cd2-82fe-4fc7-ae71-0496c310f213" path="/var/lib/kubelet/pods/3ef81cd2-82fe-4fc7-ae71-0496c310f213/volumes" Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.929953 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:55 crc kubenswrapper[4792]: I0318 15:56:55.936644 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.992651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts\") pod \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.993047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts\") pod \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.993234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jkb\" (UniqueName: \"kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb\") pod \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\" (UID: \"4d342d2e-9afe-442f-9740-ba8f58e4f2b4\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.993292 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbs4\" (UniqueName: \"kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4\") pod \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\" (UID: \"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.994843 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" (UID: "39cc7bb1-7d0e-4efd-bf91-1774a5e3765b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.995384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d342d2e-9afe-442f-9740-ba8f58e4f2b4" (UID: "4d342d2e-9afe-442f-9740-ba8f58e4f2b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:55.998309 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.006738 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb" (OuterVolumeSpecName: "kube-api-access-q6jkb") pod "4d342d2e-9afe-442f-9740-ba8f58e4f2b4" (UID: "4d342d2e-9afe-442f-9740-ba8f58e4f2b4"). InnerVolumeSpecName "kube-api-access-q6jkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.010355 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7skkc" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.054285 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4" (OuterVolumeSpecName: "kube-api-access-fwbs4") pod "39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" (UID: "39cc7bb1-7d0e-4efd-bf91-1774a5e3765b"). InnerVolumeSpecName "kube-api-access-fwbs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.097564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrk6f\" (UniqueName: \"kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f\") pod \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.097716 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nbdm\" (UniqueName: \"kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm\") pod \"6f526f38-b789-46a5-96c9-9d2c5a820e51\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.097889 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts\") pod \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\" (UID: \"3bb6d4a8-112c-41fa-b6f2-41b6d7882908\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.098052 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts\") pod \"6f526f38-b789-46a5-96c9-9d2c5a820e51\" (UID: \"6f526f38-b789-46a5-96c9-9d2c5a820e51\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.099192 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.099221 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.099236 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6jkb\" (UniqueName: \"kubernetes.io/projected/4d342d2e-9afe-442f-9740-ba8f58e4f2b4-kube-api-access-q6jkb\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.099253 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbs4\" (UniqueName: \"kubernetes.io/projected/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b-kube-api-access-fwbs4\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.100935 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f526f38-b789-46a5-96c9-9d2c5a820e51" (UID: "6f526f38-b789-46a5-96c9-9d2c5a820e51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.101549 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bb6d4a8-112c-41fa-b6f2-41b6d7882908" (UID: "3bb6d4a8-112c-41fa-b6f2-41b6d7882908"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.109473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm" (OuterVolumeSpecName: "kube-api-access-2nbdm") pod "6f526f38-b789-46a5-96c9-9d2c5a820e51" (UID: "6f526f38-b789-46a5-96c9-9d2c5a820e51"). InnerVolumeSpecName "kube-api-access-2nbdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.109545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f" (OuterVolumeSpecName: "kube-api-access-jrk6f") pod "3bb6d4a8-112c-41fa-b6f2-41b6d7882908" (UID: "3bb6d4a8-112c-41fa-b6f2-41b6d7882908"). InnerVolumeSpecName "kube-api-access-jrk6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.205750 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nbdm\" (UniqueName: \"kubernetes.io/projected/6f526f38-b789-46a5-96c9-9d2c5a820e51-kube-api-access-2nbdm\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.206348 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.206362 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f526f38-b789-46a5-96c9-9d2c5a820e51-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.206387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrk6f\" (UniqueName: \"kubernetes.io/projected/3bb6d4a8-112c-41fa-b6f2-41b6d7882908-kube-api-access-jrk6f\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.253464 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.308040 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb\") pod \"3b4b3c48-2731-463d-a64b-bbdcae951e82\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.308250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7g6h\" (UniqueName: \"kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h\") pod \"3b4b3c48-2731-463d-a64b-bbdcae951e82\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.308335 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc\") pod \"3b4b3c48-2731-463d-a64b-bbdcae951e82\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.308383 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config\") pod \"3b4b3c48-2731-463d-a64b-bbdcae951e82\" (UID: \"3b4b3c48-2731-463d-a64b-bbdcae951e82\") " Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.343846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h" (OuterVolumeSpecName: "kube-api-access-r7g6h") pod "3b4b3c48-2731-463d-a64b-bbdcae951e82" (UID: "3b4b3c48-2731-463d-a64b-bbdcae951e82"). InnerVolumeSpecName "kube-api-access-r7g6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.390797 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config" (OuterVolumeSpecName: "config") pod "3b4b3c48-2731-463d-a64b-bbdcae951e82" (UID: "3b4b3c48-2731-463d-a64b-bbdcae951e82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.409548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b4b3c48-2731-463d-a64b-bbdcae951e82" (UID: "3b4b3c48-2731-463d-a64b-bbdcae951e82"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.411465 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7g6h\" (UniqueName: \"kubernetes.io/projected/3b4b3c48-2731-463d-a64b-bbdcae951e82-kube-api-access-r7g6h\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.411490 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.411506 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.429594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b4b3c48-2731-463d-a64b-bbdcae951e82" (UID: 
"3b4b3c48-2731-463d-a64b-bbdcae951e82"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.513421 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4b3c48-2731-463d-a64b-bbdcae951e82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.576413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2wms" event={"ID":"39cc7bb1-7d0e-4efd-bf91-1774a5e3765b","Type":"ContainerDied","Data":"7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206"} Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.576456 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c274c5023200c1124cedc71578492d6e1be21ea97937295b198ac86e9455206" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.576526 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2wms" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.581188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" event={"ID":"3b4b3c48-2731-463d-a64b-bbdcae951e82","Type":"ContainerDied","Data":"56eca9f76da1bf6a35f9039e9d499100a6a2331a8901031bf94d5dd73c38e5e6"} Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.581252 4792 scope.go:117] "RemoveContainer" containerID="ecdb27585784ead5aa6df245a5ca3aba0bad0e823ddb60af7cad011f11006b0d" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.581404 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-62hj5" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.588142 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7skkc" event={"ID":"6f526f38-b789-46a5-96c9-9d2c5a820e51","Type":"ContainerDied","Data":"28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a"} Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.588176 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28cf0505389e0e39f9c6d3d564ac1e25e79160161428da7b01b7d42f70dc389a" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.588237 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7skkc" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.592658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" event={"ID":"3bb6d4a8-112c-41fa-b6f2-41b6d7882908","Type":"ContainerDied","Data":"e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa"} Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.592718 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e4402f5786fc41554ba50193adeadf59533ccc7f860ff4e7d63bfa98db2ffa" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.592723 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e409-account-create-update-jmr8x" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.598489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d26-account-create-update-r7cpt" event={"ID":"4d342d2e-9afe-442f-9740-ba8f58e4f2b4","Type":"ContainerDied","Data":"a20675a875868b1779245bcde43cc0d32a25e128ae7383f3b977228e17ddfda2"} Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.598545 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20675a875868b1779245bcde43cc0d32a25e128ae7383f3b977228e17ddfda2" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.598750 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d26-account-create-update-r7cpt" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.633372 4792 scope.go:117] "RemoveContainer" containerID="71da5f56a6fbec3a48e0d7f3228a022e08687a6077e9db4a7d983aa42a8ab3ea" Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.694767 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"] Mar 18 15:56:56 crc kubenswrapper[4792]: I0318 15:56:56.705274 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-62hj5"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.134827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.135344 4792 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.135358 4792 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.135407 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:01.135389008 +0000 UTC m=+1370.004717945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:56:57 crc kubenswrapper[4792]: W0318 15:56:57.157889 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5670c2b9_9a80_4670_a2d2_0135fbb5a77d.slice/crio-d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6 WatchSource:0}: Error finding container d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6: Status 404 returned error can't find the container with id d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6 Mar 18 15:56:57 crc kubenswrapper[4792]: W0318 15:56:57.160411 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6be42760_b224_4ca3_8870_92131f90c77b.slice/crio-ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901 WatchSource:0}: Error finding container ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901: Status 404 returned error can't find the container with id ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901 Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.165342 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rks8d"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 
15:56:57.180464 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-98a6-account-create-update-hg25j"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.206349 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f402-account-create-update-qxl8l"] Mar 18 15:56:57 crc kubenswrapper[4792]: W0318 15:56:57.210706 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1fbe7b_1a12_45d5_aa20_b56d3fad539f.slice/crio-bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5 WatchSource:0}: Error finding container bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5: Status 404 returned error can't find the container with id bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5 Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.335275 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ft5cr"] Mar 18 15:56:57 crc kubenswrapper[4792]: W0318 15:56:57.344404 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fe6d817_db7d_4864_9cfb_1a399587c3b9.slice/crio-fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504 WatchSource:0}: Error finding container fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504: Status 404 returned error can't find the container with id fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504 Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.357151 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n7whx"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.366047 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610220 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mysqld-exporter-openstack-cell1-db-create-tldpp"] Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610633 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610645 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610655 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d342d2e-9afe-442f-9740-ba8f58e4f2b4" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610661 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d342d2e-9afe-442f-9740-ba8f58e4f2b4" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610681 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="init" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610686 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="init" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610698 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb6d4a8-112c-41fa-b6f2-41b6d7882908" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610703 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb6d4a8-112c-41fa-b6f2-41b6d7882908" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610716 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f526f38-b789-46a5-96c9-9d2c5a820e51" containerName="mariadb-database-create" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610722 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6f526f38-b789-46a5-96c9-9d2c5a820e51" containerName="mariadb-database-create" Mar 18 15:56:57 crc kubenswrapper[4792]: E0318 15:56:57.610737 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="dnsmasq-dns" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610742 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="dnsmasq-dns" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.610948 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d342d2e-9afe-442f-9740-ba8f58e4f2b4" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.611033 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" containerName="dnsmasq-dns" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.611055 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f526f38-b789-46a5-96c9-9d2c5a820e51" containerName="mariadb-database-create" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.611065 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb6d4a8-112c-41fa-b6f2-41b6d7882908" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.611076 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" containerName="mariadb-account-create-update" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.611852 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.616483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98a6-account-create-update-hg25j" event={"ID":"6be42760-b224-4ca3-8870-92131f90c77b","Type":"ContainerStarted","Data":"c486ec8b4994b185991d3be76c07402a4ffa86507c4323f1f2233014197038cb"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.616521 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98a6-account-create-update-hg25j" event={"ID":"6be42760-b224-4ca3-8870-92131f90c77b","Type":"ContainerStarted","Data":"ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.637302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7c13a8c4-d4ee-4af5-95bd-c28a60350d14","Type":"ContainerStarted","Data":"c9a1bd8b2a74a39b0cdf632978affda99cc29594906fe6e5faf5062f634fa907"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.637388 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7c13a8c4-d4ee-4af5-95bd-c28a60350d14","Type":"ContainerStarted","Data":"aab37fb5f189fd274f6763799b241afebc990cb15d21ccc791e6573ab1bd3243"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.637530 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.641582 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-tldpp"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.657696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerStarted","Data":"e313a088eafeb6957bebebf6888575180f7655f78c16d6183ae9dcb439f10ee5"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.667406 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f402-account-create-update-qxl8l" event={"ID":"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f","Type":"ContainerStarted","Data":"eb0e75a05b5ce80826ff0b1dcb9f596dffa9606205a0b26e894555066e9cb2d0"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.667456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f402-account-create-update-qxl8l" event={"ID":"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f","Type":"ContainerStarted","Data":"bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.689280 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-98a6-account-create-update-hg25j" podStartSLOduration=7.689252416 podStartE2EDuration="7.689252416s" podCreationTimestamp="2026-03-18 15:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:57.652544306 +0000 UTC m=+1366.521873263" watchObservedRunningTime="2026-03-18 15:56:57.689252416 +0000 UTC m=+1366.558581353" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.697271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" event={"ID":"fa8c1204-320e-41c5-8393-c13f50febe7e","Type":"ContainerStarted","Data":"f3ba0cc2b9a066fe7a65566971942c7ef6a34c41be1a946df0ea4310d5c49c11"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.702433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7whx" 
event={"ID":"2156b56d-e393-4230-b0ae-057041cee710","Type":"ContainerStarted","Data":"cda88ae489406bf2b5b32c4d91e8c5b0d552c1b5a0af53431dac2c5430c426d1"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.704246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rks8d" event={"ID":"5670c2b9-9a80-4670-a2d2-0135fbb5a77d","Type":"ContainerStarted","Data":"d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.706830 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.37607522 podStartE2EDuration="17.706808897s" podCreationTimestamp="2026-03-18 15:56:40 +0000 UTC" firstStartedPulling="2026-03-18 15:56:41.706689566 +0000 UTC m=+1350.576018503" lastFinishedPulling="2026-03-18 15:56:56.037423243 +0000 UTC m=+1364.906752180" observedRunningTime="2026-03-18 15:56:57.671050636 +0000 UTC m=+1366.540379573" watchObservedRunningTime="2026-03-18 15:56:57.706808897 +0000 UTC m=+1366.576137834" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.712449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ft5cr" event={"ID":"9fe6d817-db7d-4864-9cfb-1a399587c3b9","Type":"ContainerStarted","Data":"fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504"} Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.714194 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f402-account-create-update-qxl8l" podStartSLOduration=7.714171483 podStartE2EDuration="7.714171483s" podCreationTimestamp="2026-03-18 15:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:57.686289103 +0000 UTC m=+1366.555618040" watchObservedRunningTime="2026-03-18 15:56:57.714171483 +0000 UTC m=+1366.583500420" Mar 18 15:56:57 crc 
kubenswrapper[4792]: I0318 15:56:57.739416 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-ft5cr" podStartSLOduration=7.739393808 podStartE2EDuration="7.739393808s" podCreationTimestamp="2026-03-18 15:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:57.738006283 +0000 UTC m=+1366.607335240" watchObservedRunningTime="2026-03-18 15:56:57.739393808 +0000 UTC m=+1366.608722745" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.749459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.749734 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pks\" (UniqueName: \"kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.852931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pks\" (UniqueName: \"kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.853263 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.855255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.887648 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4b3c48-2731-463d-a64b-bbdcae951e82" path="/var/lib/kubelet/pods/3b4b3c48-2731-463d-a64b-bbdcae951e82/volumes" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.890167 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pks\" (UniqueName: \"kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks\") pod \"mysqld-exporter-openstack-cell1-db-create-tldpp\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.925334 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p2wms"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.939414 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p2wms"] Mar 18 15:56:57 crc kubenswrapper[4792]: I0318 15:56:57.950245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.033317 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-7806-account-create-update-vflmx"] Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.034755 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.038691 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.044890 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-7806-account-create-update-vflmx"] Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.165727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thf6l\" (UniqueName: \"kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.166181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.269247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thf6l\" (UniqueName: 
\"kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.269311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.270353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.292805 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thf6l\" (UniqueName: \"kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l\") pod \"mysqld-exporter-7806-account-create-update-vflmx\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.471240 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.497138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-tldpp"] Mar 18 15:56:58 crc kubenswrapper[4792]: W0318 15:56:58.527295 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda93f5d6c_7b86_47bd_952e_a1a563065c76.slice/crio-42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7 WatchSource:0}: Error finding container 42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7: Status 404 returned error can't find the container with id 42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7 Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.731460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" event={"ID":"a93f5d6c-7b86-47bd-952e-a1a563065c76","Type":"ContainerStarted","Data":"42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7"} Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.747440 4792 generic.go:334] "Generic (PLEG): container finished" podID="9fe6d817-db7d-4864-9cfb-1a399587c3b9" containerID="511db70907d804b39c2b2c8e66c764c42732087e3dfbececfac0290967a0684d" exitCode=0 Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.748096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ft5cr" event={"ID":"9fe6d817-db7d-4864-9cfb-1a399587c3b9","Type":"ContainerDied","Data":"511db70907d804b39c2b2c8e66c764c42732087e3dfbececfac0290967a0684d"} Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.753637 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" containerID="eb0e75a05b5ce80826ff0b1dcb9f596dffa9606205a0b26e894555066e9cb2d0" exitCode=0 Mar 18 15:56:58 crc 
kubenswrapper[4792]: I0318 15:56:58.754045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f402-account-create-update-qxl8l" event={"ID":"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f","Type":"ContainerDied","Data":"eb0e75a05b5ce80826ff0b1dcb9f596dffa9606205a0b26e894555066e9cb2d0"} Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.756823 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerID="2bed1913e692f9d8e0ac2042a50544a1e56841327dc84a2607a5697896d758ab" exitCode=0 Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.757030 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" event={"ID":"fa8c1204-320e-41c5-8393-c13f50febe7e","Type":"ContainerDied","Data":"2bed1913e692f9d8e0ac2042a50544a1e56841327dc84a2607a5697896d758ab"} Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.768087 4792 generic.go:334] "Generic (PLEG): container finished" podID="2156b56d-e393-4230-b0ae-057041cee710" containerID="d4da398f7cccec42ba8c48cf52a1a5c57dfaf51bd19332afaf30cc84d2b62334" exitCode=0 Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.768218 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7whx" event={"ID":"2156b56d-e393-4230-b0ae-057041cee710","Type":"ContainerDied","Data":"d4da398f7cccec42ba8c48cf52a1a5c57dfaf51bd19332afaf30cc84d2b62334"} Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.773443 4792 generic.go:334] "Generic (PLEG): container finished" podID="6be42760-b224-4ca3-8870-92131f90c77b" containerID="c486ec8b4994b185991d3be76c07402a4ffa86507c4323f1f2233014197038cb" exitCode=0 Mar 18 15:56:58 crc kubenswrapper[4792]: I0318 15:56:58.773642 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98a6-account-create-update-hg25j" 
event={"ID":"6be42760-b224-4ca3-8870-92131f90c77b","Type":"ContainerDied","Data":"c486ec8b4994b185991d3be76c07402a4ffa86507c4323f1f2233014197038cb"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.348879 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-7806-account-create-update-vflmx"] Mar 18 15:56:59 crc kubenswrapper[4792]: W0318 15:56:59.352908 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0402169_72db_460a_a8ae_0e6f8dbd696b.slice/crio-1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a WatchSource:0}: Error finding container 1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a: Status 404 returned error can't find the container with id 1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.571194 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-85f4b56fb6-xnb5g" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerName="console" containerID="cri-o://83ed938be43060ade26a09a9ab31173a73b86a81c312731f1f25b6a6781815d7" gracePeriod=15 Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.791279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" event={"ID":"fa8c1204-320e-41c5-8393-c13f50febe7e","Type":"ContainerStarted","Data":"86f3d861e016ac7a7173ab172f7cd23a4c1117667a6d366e1f4e198fa206b9fc"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.791387 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.794444 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f4b56fb6-xnb5g_b9bcba05-574e-478e-880d-0f46b4ed2052/console/0.log" Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 
15:56:59.794496 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerID="83ed938be43060ade26a09a9ab31173a73b86a81c312731f1f25b6a6781815d7" exitCode=2 Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.794584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f4b56fb6-xnb5g" event={"ID":"b9bcba05-574e-478e-880d-0f46b4ed2052","Type":"ContainerDied","Data":"83ed938be43060ade26a09a9ab31173a73b86a81c312731f1f25b6a6781815d7"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.798318 4792 generic.go:334] "Generic (PLEG): container finished" podID="a93f5d6c-7b86-47bd-952e-a1a563065c76" containerID="ca2d28e25db123a3fa0db97e3652bdd71a0e3252324bd76ee9d2b126206ac73b" exitCode=0 Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.798412 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" event={"ID":"a93f5d6c-7b86-47bd-952e-a1a563065c76","Type":"ContainerDied","Data":"ca2d28e25db123a3fa0db97e3652bdd71a0e3252324bd76ee9d2b126206ac73b"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.801525 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" event={"ID":"c0402169-72db-460a-a8ae-0e6f8dbd696b","Type":"ContainerStarted","Data":"63760b16c1843464a4d0e076f15cd2ffbf94c45592f7020559517429c3016f13"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.801738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" event={"ID":"c0402169-72db-460a-a8ae-0e6f8dbd696b","Type":"ContainerStarted","Data":"1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a"} Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.824296 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podStartSLOduration=7.824262742 
podStartE2EDuration="7.824262742s" podCreationTimestamp="2026-03-18 15:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:59.820379148 +0000 UTC m=+1368.689708085" watchObservedRunningTime="2026-03-18 15:56:59.824262742 +0000 UTC m=+1368.693591679" Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.875434 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cc7bb1-7d0e-4efd-bf91-1774a5e3765b" path="/var/lib/kubelet/pods/39cc7bb1-7d0e-4efd-bf91-1774a5e3765b/volumes" Mar 18 15:56:59 crc kubenswrapper[4792]: I0318 15:56:59.880944 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" podStartSLOduration=2.880916331 podStartE2EDuration="2.880916331s" podCreationTimestamp="2026-03-18 15:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:56:59.861124389 +0000 UTC m=+1368.730453326" watchObservedRunningTime="2026-03-18 15:56:59.880916331 +0000 UTC m=+1368.750245268" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.005269 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mqtz6"] Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.006757 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.017613 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sw42f" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.017810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.021530 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mqtz6"] Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.161874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.162294 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.162780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.162892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnkc6\" (UniqueName: 
\"kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.276472 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.276564 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.276652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.276746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnkc6\" (UniqueName: \"kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.300305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnkc6\" (UniqueName: \"kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6\") pod 
\"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.301157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.302779 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.318617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data\") pod \"glance-db-sync-mqtz6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.336592 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mqtz6" Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.815252 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0402169-72db-460a-a8ae-0e6f8dbd696b" containerID="63760b16c1843464a4d0e076f15cd2ffbf94c45592f7020559517429c3016f13" exitCode=0 Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.815325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" event={"ID":"c0402169-72db-460a-a8ae-0e6f8dbd696b","Type":"ContainerDied","Data":"63760b16c1843464a4d0e076f15cd2ffbf94c45592f7020559517429c3016f13"} Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.830862 4792 generic.go:334] "Generic (PLEG): container finished" podID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerID="7782a9ac2dbdefa63ef4731235464b9470b5ae66917c638938707289f3e36397" exitCode=0 Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.831103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerDied","Data":"7782a9ac2dbdefa63ef4731235464b9470b5ae66917c638938707289f3e36397"} Mar 18 15:57:00 crc kubenswrapper[4792]: I0318 15:57:00.842105 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerStarted","Data":"4f0591bb4e037d2141fc349b99e7b56de5eaeeb03c840b837da1ffde0a26b88a"} Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.204898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:57:01 crc kubenswrapper[4792]: E0318 15:57:01.205072 4792 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:57:01 crc kubenswrapper[4792]: E0318 15:57:01.205091 4792 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:57:01 crc kubenswrapper[4792]: E0318 15:57:01.205142 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:09.205126927 +0000 UTC m=+1378.074455864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.381486 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.512080 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts\") pod \"6be42760-b224-4ca3-8870-92131f90c77b\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.512251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntkk4\" (UniqueName: \"kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4\") pod \"6be42760-b224-4ca3-8870-92131f90c77b\" (UID: \"6be42760-b224-4ca3-8870-92131f90c77b\") " Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.513207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6be42760-b224-4ca3-8870-92131f90c77b" (UID: "6be42760-b224-4ca3-8870-92131f90c77b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.513572 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6be42760-b224-4ca3-8870-92131f90c77b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.516434 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4" (OuterVolumeSpecName: "kube-api-access-ntkk4") pod "6be42760-b224-4ca3-8870-92131f90c77b" (UID: "6be42760-b224-4ca3-8870-92131f90c77b"). InnerVolumeSpecName "kube-api-access-ntkk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.579361 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wxk42"] Mar 18 15:57:01 crc kubenswrapper[4792]: E0318 15:57:01.579951 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be42760-b224-4ca3-8870-92131f90c77b" containerName="mariadb-account-create-update" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.579992 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be42760-b224-4ca3-8870-92131f90c77b" containerName="mariadb-account-create-update" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.580201 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be42760-b224-4ca3-8870-92131f90c77b" containerName="mariadb-account-create-update" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.581097 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.583757 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.596572 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxk42"] Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.616511 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntkk4\" (UniqueName: \"kubernetes.io/projected/6be42760-b224-4ca3-8870-92131f90c77b-kube-api-access-ntkk4\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.719143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts\") pod \"root-account-create-update-wxk42\" 
(UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.719698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kbh\" (UniqueName: \"kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh\") pod \"root-account-create-update-wxk42\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.822159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kbh\" (UniqueName: \"kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh\") pod \"root-account-create-update-wxk42\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.823022 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts\") pod \"root-account-create-update-wxk42\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.824070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts\") pod \"root-account-create-update-wxk42\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.844001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kbh\" (UniqueName: \"kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh\") 
pod \"root-account-create-update-wxk42\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.865739 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-98a6-account-create-update-hg25j" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.883943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-98a6-account-create-update-hg25j" event={"ID":"6be42760-b224-4ca3-8870-92131f90c77b","Type":"ContainerDied","Data":"ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901"} Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.884289 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5359dd1a987c5631db8f0f246c65f8eb6c07490adfacbcfb0337b3e6c6f901" Mar 18 15:57:01 crc kubenswrapper[4792]: I0318 15:57:01.899487 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:02 crc kubenswrapper[4792]: I0318 15:57:02.104613 4792 patch_prober.go:28] interesting pod/console-85f4b56fb6-xnb5g container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.93:8443/health\": dial tcp 10.217.0.93:8443: connect: connection refused" start-of-body= Mar 18 15:57:02 crc kubenswrapper[4792]: I0318 15:57:02.104661 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-85f4b56fb6-xnb5g" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerName="console" probeResult="failure" output="Get \"https://10.217.0.93:8443/health\": dial tcp 10.217.0.93:8443: connect: connection refused" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.333994 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ft5cr" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.338827 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.339182 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.423437 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.455943 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n7whx" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.469050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7jj\" (UniqueName: \"kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj\") pod \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.469175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thf6l\" (UniqueName: \"kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l\") pod \"c0402169-72db-460a-a8ae-0e6f8dbd696b\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.471099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts\") pod \"c0402169-72db-460a-a8ae-0e6f8dbd696b\" (UID: \"c0402169-72db-460a-a8ae-0e6f8dbd696b\") " Mar 18 15:57:03 crc 
kubenswrapper[4792]: I0318 15:57:03.471139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts\") pod \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\" (UID: \"9fe6d817-db7d-4864-9cfb-1a399587c3b9\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.471215 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pks\" (UniqueName: \"kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks\") pod \"a93f5d6c-7b86-47bd-952e-a1a563065c76\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.471347 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts\") pod \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.471372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdzh5\" (UniqueName: \"kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5\") pod \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\" (UID: \"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.471581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts\") pod \"a93f5d6c-7b86-47bd-952e-a1a563065c76\" (UID: \"a93f5d6c-7b86-47bd-952e-a1a563065c76\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.473087 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" (UID: "3b1fbe7b-1a12-45d5-aa20-b56d3fad539f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.473888 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.473998 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fe6d817-db7d-4864-9cfb-1a399587c3b9" (UID: "9fe6d817-db7d-4864-9cfb-1a399587c3b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.474457 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0402169-72db-460a-a8ae-0e6f8dbd696b" (UID: "c0402169-72db-460a-a8ae-0e6f8dbd696b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.474484 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a93f5d6c-7b86-47bd-952e-a1a563065c76" (UID: "a93f5d6c-7b86-47bd-952e-a1a563065c76"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.481008 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f4b56fb6-xnb5g_b9bcba05-574e-478e-880d-0f46b4ed2052/console/0.log" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.481143 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.483493 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks" (OuterVolumeSpecName: "kube-api-access-c2pks") pod "a93f5d6c-7b86-47bd-952e-a1a563065c76" (UID: "a93f5d6c-7b86-47bd-952e-a1a563065c76"). InnerVolumeSpecName "kube-api-access-c2pks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.485470 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5" (OuterVolumeSpecName: "kube-api-access-jdzh5") pod "3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" (UID: "3b1fbe7b-1a12-45d5-aa20-b56d3fad539f"). InnerVolumeSpecName "kube-api-access-jdzh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.486565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj" (OuterVolumeSpecName: "kube-api-access-xm7jj") pod "9fe6d817-db7d-4864-9cfb-1a399587c3b9" (UID: "9fe6d817-db7d-4864-9cfb-1a399587c3b9"). InnerVolumeSpecName "kube-api-access-xm7jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.488800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l" (OuterVolumeSpecName: "kube-api-access-thf6l") pod "c0402169-72db-460a-a8ae-0e6f8dbd696b" (UID: "c0402169-72db-460a-a8ae-0e6f8dbd696b"). InnerVolumeSpecName "kube-api-access-thf6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.576920 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577057 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xzvd\" (UniqueName: \"kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc 
kubenswrapper[4792]: I0318 15:57:03.577168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577262 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7jch\" (UniqueName: \"kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch\") pod \"2156b56d-e393-4230-b0ae-057041cee710\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts\") pod \"2156b56d-e393-4230-b0ae-057041cee710\" (UID: \"2156b56d-e393-4230-b0ae-057041cee710\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.577486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config\") pod \"b9bcba05-574e-478e-880d-0f46b4ed2052\" (UID: \"b9bcba05-574e-478e-880d-0f46b4ed2052\") " Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578440 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0402169-72db-460a-a8ae-0e6f8dbd696b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578463 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fe6d817-db7d-4864-9cfb-1a399587c3b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578474 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pks\" (UniqueName: \"kubernetes.io/projected/a93f5d6c-7b86-47bd-952e-a1a563065c76-kube-api-access-c2pks\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578488 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdzh5\" (UniqueName: \"kubernetes.io/projected/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f-kube-api-access-jdzh5\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578500 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a93f5d6c-7b86-47bd-952e-a1a563065c76-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578511 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7jj\" (UniqueName: \"kubernetes.io/projected/9fe6d817-db7d-4864-9cfb-1a399587c3b9-kube-api-access-xm7jj\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.578522 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thf6l\" (UniqueName: \"kubernetes.io/projected/c0402169-72db-460a-a8ae-0e6f8dbd696b-kube-api-access-thf6l\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.579564 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.580267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.582084 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.582133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config" (OuterVolumeSpecName: "console-config") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.582571 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.583927 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2156b56d-e393-4230-b0ae-057041cee710" (UID: "2156b56d-e393-4230-b0ae-057041cee710"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.587336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd" (OuterVolumeSpecName: "kube-api-access-6xzvd") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "kube-api-access-6xzvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.590842 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9bcba05-574e-478e-880d-0f46b4ed2052" (UID: "b9bcba05-574e-478e-880d-0f46b4ed2052"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.598153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch" (OuterVolumeSpecName: "kube-api-access-k7jch") pod "2156b56d-e393-4230-b0ae-057041cee710" (UID: "2156b56d-e393-4230-b0ae-057041cee710"). InnerVolumeSpecName "kube-api-access-k7jch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680421 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680469 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680483 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xzvd\" (UniqueName: \"kubernetes.io/projected/b9bcba05-574e-478e-880d-0f46b4ed2052-kube-api-access-6xzvd\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680498 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680512 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7jch\" (UniqueName: \"kubernetes.io/projected/2156b56d-e393-4230-b0ae-057041cee710-kube-api-access-k7jch\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680524 4792 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680536 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2156b56d-e393-4230-b0ae-057041cee710-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680553 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9bcba05-574e-478e-880d-0f46b4ed2052-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.680566 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9bcba05-574e-478e-880d-0f46b4ed2052-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.892034 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxk42"] Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.896220 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7whx" event={"ID":"2156b56d-e393-4230-b0ae-057041cee710","Type":"ContainerDied","Data":"cda88ae489406bf2b5b32c4d91e8c5b0d552c1b5a0af53431dac2c5430c426d1"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.896282 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda88ae489406bf2b5b32c4d91e8c5b0d552c1b5a0af53431dac2c5430c426d1" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.896286 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n7whx" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.900876 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f4b56fb6-xnb5g_b9bcba05-574e-478e-880d-0f46b4ed2052/console/0.log" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.900985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f4b56fb6-xnb5g" event={"ID":"b9bcba05-574e-478e-880d-0f46b4ed2052","Type":"ContainerDied","Data":"2dccd64441942b3b6ab4baa8b27e5b93299c3d7e4a9cdda548e1f5b02f5e8346"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.901026 4792 scope.go:117] "RemoveContainer" containerID="83ed938be43060ade26a09a9ab31173a73b86a81c312731f1f25b6a6781815d7" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.901211 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f4b56fb6-xnb5g" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.905383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" event={"ID":"a93f5d6c-7b86-47bd-952e-a1a563065c76","Type":"ContainerDied","Data":"42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.905436 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42cb64e555ec906c245a3c19e02177fa5efaf6c49855866d214e03f654020bc7" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.905434 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-tldpp" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.912735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ft5cr" event={"ID":"9fe6d817-db7d-4864-9cfb-1a399587c3b9","Type":"ContainerDied","Data":"fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.913306 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fea08a2195d90a0955d3d6e7535dc946881f089aa33d203a985506c022bf9504" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.913044 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ft5cr" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.937442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerStarted","Data":"4039bd52f24043dbbc782ccbdcbb41fe807871286bbdd4f16a6108b39598110c"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.938842 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.940931 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f402-account-create-update-qxl8l" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.941019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f402-account-create-update-qxl8l" event={"ID":"3b1fbe7b-1a12-45d5-aa20-b56d3fad539f","Type":"ContainerDied","Data":"bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.941381 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdcfa71fc956760b18930a10496346353de73ae90a84dec0834379edcd8b72e5" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.945364 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.945067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-7806-account-create-update-vflmx" event={"ID":"c0402169-72db-460a-a8ae-0e6f8dbd696b","Type":"ContainerDied","Data":"1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a"} Mar 18 15:57:03 crc kubenswrapper[4792]: I0318 15:57:03.945811 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1648c7b16744a99d99883a0929e4e129b625829c513b44897f99c0dee46af71a" Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.001393 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=41.34867128 podStartE2EDuration="1m9.001360657s" podCreationTimestamp="2026-03-18 15:55:55 +0000 UTC" firstStartedPulling="2026-03-18 15:55:58.00001282 +0000 UTC m=+1306.869341757" lastFinishedPulling="2026-03-18 15:56:25.652702197 +0000 UTC m=+1334.522031134" observedRunningTime="2026-03-18 15:57:03.969154548 +0000 UTC m=+1372.838483515" watchObservedRunningTime="2026-03-18 15:57:04.001360657 +0000 UTC 
m=+1372.870689594" Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.042622 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mqtz6"] Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.217926 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.230851 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85f4b56fb6-xnb5g"] Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.715961 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m977k" podUID="b90ccac6-a973-4572-834a-f7215cfc72a7" containerName="ovn-controller" probeResult="failure" output=< Mar 18 15:57:04 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 15:57:04 crc kubenswrapper[4792]: > Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.966006 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mqtz6" event={"ID":"bc4c925f-3e43-4351-9703-7fdd44a1a9d6","Type":"ContainerStarted","Data":"d1ddb043f889189a4edf98c3e6cac2948dec5ae5f90dbd0c9bf52b6192251963"} Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.969507 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rks8d" event={"ID":"5670c2b9-9a80-4670-a2d2-0135fbb5a77d","Type":"ContainerStarted","Data":"9aebd5d02c7c9b02bb4aba9a8be3b0d9f5ce8e4d3ebf7d71c7b27f99ffb7c783"} Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.979386 4792 generic.go:334] "Generic (PLEG): container finished" podID="77b2e810-4086-42c8-824b-5f9ad416e639" containerID="4e1e12215aecbdfd4ece57b53e97d16d651e67fb481d33da31e7a578f9f6f5d5" exitCode=0 Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.979588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxk42" 
event={"ID":"77b2e810-4086-42c8-824b-5f9ad416e639","Type":"ContainerDied","Data":"4e1e12215aecbdfd4ece57b53e97d16d651e67fb481d33da31e7a578f9f6f5d5"} Mar 18 15:57:04 crc kubenswrapper[4792]: I0318 15:57:04.979715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxk42" event={"ID":"77b2e810-4086-42c8-824b-5f9ad416e639","Type":"ContainerStarted","Data":"9f6b89d5b45e8b1bec95107523cb50c1407e42e15882b1a057f9e0a2e9e90409"} Mar 18 15:57:05 crc kubenswrapper[4792]: I0318 15:57:05.010161 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rks8d" podStartSLOduration=5.992987279 podStartE2EDuration="12.010125484s" podCreationTimestamp="2026-03-18 15:56:53 +0000 UTC" firstStartedPulling="2026-03-18 15:56:57.161031777 +0000 UTC m=+1366.030360714" lastFinishedPulling="2026-03-18 15:57:03.178169982 +0000 UTC m=+1372.047498919" observedRunningTime="2026-03-18 15:57:04.995771306 +0000 UTC m=+1373.865100243" watchObservedRunningTime="2026-03-18 15:57:05.010125484 +0000 UTC m=+1373.879454421" Mar 18 15:57:05 crc kubenswrapper[4792]: I0318 15:57:05.866085 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" path="/var/lib/kubelet/pods/b9bcba05-574e-478e-880d-0f46b4ed2052/volumes" Mar 18 15:57:05 crc kubenswrapper[4792]: I0318 15:57:05.992848 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerID="473056b2f961908787d6d0bbd7279324cef361cac398147642946a887e53c916" exitCode=0 Mar 18 15:57:05 crc kubenswrapper[4792]: I0318 15:57:05.994494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerDied","Data":"473056b2f961908787d6d0bbd7279324cef361cac398147642946a887e53c916"} Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.518335 4792 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.580222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2kbh\" (UniqueName: \"kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh\") pod \"77b2e810-4086-42c8-824b-5f9ad416e639\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.580483 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts\") pod \"77b2e810-4086-42c8-824b-5f9ad416e639\" (UID: \"77b2e810-4086-42c8-824b-5f9ad416e639\") " Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.581599 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77b2e810-4086-42c8-824b-5f9ad416e639" (UID: "77b2e810-4086-42c8-824b-5f9ad416e639"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.586917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh" (OuterVolumeSpecName: "kube-api-access-k2kbh") pod "77b2e810-4086-42c8-824b-5f9ad416e639" (UID: "77b2e810-4086-42c8-824b-5f9ad416e639"). InnerVolumeSpecName "kube-api-access-k2kbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.684154 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77b2e810-4086-42c8-824b-5f9ad416e639-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:06 crc kubenswrapper[4792]: I0318 15:57:06.684585 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2kbh\" (UniqueName: \"kubernetes.io/projected/77b2e810-4086-42c8-824b-5f9ad416e639-kube-api-access-k2kbh\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.041428 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerID="f74bb4cde5f8d03c3be1828e98466fefc661c6bd28ec4a480e92e59898e2c8fa" exitCode=0 Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.041597 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerDied","Data":"f74bb4cde5f8d03c3be1828e98466fefc661c6bd28ec4a480e92e59898e2c8fa"} Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.060587 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxk42" event={"ID":"77b2e810-4086-42c8-824b-5f9ad416e639","Type":"ContainerDied","Data":"9f6b89d5b45e8b1bec95107523cb50c1407e42e15882b1a057f9e0a2e9e90409"} Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.060635 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6b89d5b45e8b1bec95107523cb50c1407e42e15882b1a057f9e0a2e9e90409" Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.060698 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wxk42" Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.397989 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.465227 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"] Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.465500 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-ghqxp" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="dnsmasq-dns" containerID="cri-o://1a49425564367c011a4dc8f6f02fb109cfcd29cda98c9a5d255c481d60111d94" gracePeriod=10 Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.923066 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wxk42"] Mar 18 15:57:07 crc kubenswrapper[4792]: I0318 15:57:07.935899 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wxk42"] Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.081929 4792 generic.go:334] "Generic (PLEG): container finished" podID="8e546b43-e259-4d0e-81f8-2381371157bd" containerID="1a49425564367c011a4dc8f6f02fb109cfcd29cda98c9a5d255c481d60111d94" exitCode=0 Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.082008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ghqxp" event={"ID":"8e546b43-e259-4d0e-81f8-2381371157bd","Type":"ContainerDied","Data":"1a49425564367c011a4dc8f6f02fb109cfcd29cda98c9a5d255c481d60111d94"} Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.090090 4792 generic.go:334] "Generic (PLEG): container finished" podID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerID="8ca1e9174e1b68cb3a500c36721db12755c75144ada52226620997217d432a3c" exitCode=0 Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 
15:57:08.090183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerDied","Data":"8ca1e9174e1b68cb3a500c36721db12755c75144ada52226620997217d432a3c"} Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.097954 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerStarted","Data":"a3bc29f7139dbf87b29ddc94cff1982a1d93cede423621e24bd609c3d2e251e1"} Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.098316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156251 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156721 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerName="console" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156734 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerName="console" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe6d817-db7d-4864-9cfb-1a399587c3b9" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156753 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe6d817-db7d-4864-9cfb-1a399587c3b9" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156768 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93f5d6c-7b86-47bd-952e-a1a563065c76" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156775 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a93f5d6c-7b86-47bd-952e-a1a563065c76" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156787 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2156b56d-e393-4230-b0ae-057041cee710" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2156b56d-e393-4230-b0ae-057041cee710" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156820 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156826 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b2e810-4086-42c8-824b-5f9ad416e639" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156845 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b2e810-4086-42c8-824b-5f9ad416e639" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: E0318 15:57:08.156858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0402169-72db-460a-a8ae-0e6f8dbd696b" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.156864 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0402169-72db-460a-a8ae-0e6f8dbd696b" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157062 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" containerName="mariadb-account-create-update" Mar 18 
15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157076 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0402169-72db-460a-a8ae-0e6f8dbd696b" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157092 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fe6d817-db7d-4864-9cfb-1a399587c3b9" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157102 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bcba05-574e-478e-880d-0f46b4ed2052" containerName="console" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157112 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b2e810-4086-42c8-824b-5f9ad416e639" containerName="mariadb-account-create-update" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157122 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2156b56d-e393-4230-b0ae-057041cee710" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157134 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93f5d6c-7b86-47bd-952e-a1a563065c76" containerName="mariadb-database-create" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.157857 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.162242 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.193110 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.200451 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371963.654348 podStartE2EDuration="1m13.200426981s" podCreationTimestamp="2026-03-18 15:55:55 +0000 UTC" firstStartedPulling="2026-03-18 15:55:57.563601841 +0000 UTC m=+1306.432930788" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:08.185919928 +0000 UTC m=+1377.055248865" watchObservedRunningTime="2026-03-18 15:57:08.200426981 +0000 UTC m=+1377.069755918" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.347298 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-485fc\" (UniqueName: \"kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.347662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.347689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.450704 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-485fc\" (UniqueName: \"kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.450794 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.450830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.458386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.474564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " 
pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.474999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-485fc\" (UniqueName: \"kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc\") pod \"mysqld-exporter-0\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " pod="openstack/mysqld-exporter-0" Mar 18 15:57:08 crc kubenswrapper[4792]: I0318 15:57:08.500784 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 15:57:09 crc kubenswrapper[4792]: I0318 15:57:09.275366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:57:09 crc kubenswrapper[4792]: E0318 15:57:09.276365 4792 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 15:57:09 crc kubenswrapper[4792]: E0318 15:57:09.278756 4792 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 15:57:09 crc kubenswrapper[4792]: E0318 15:57:09.279442 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift podName:3c345496-7b4e-41f0-a5ae-4c503e452221 nodeName:}" failed. No retries permitted until 2026-03-18 15:57:25.27941306 +0000 UTC m=+1394.148742157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift") pod "swift-storage-0" (UID: "3c345496-7b4e-41f0-a5ae-4c503e452221") : configmap "swift-ring-files" not found Mar 18 15:57:09 crc kubenswrapper[4792]: I0318 15:57:09.906233 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b2e810-4086-42c8-824b-5f9ad416e639" path="/var/lib/kubelet/pods/77b2e810-4086-42c8-824b-5f9ad416e639/volumes" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.001900 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.100267 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m977k" podUID="b90ccac6-a973-4572-834a-f7215cfc72a7" containerName="ovn-controller" probeResult="failure" output=< Mar 18 15:57:10 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 15:57:10 crc kubenswrapper[4792]: > Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.220385 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6xllm" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.274159 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ghqxp" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.414405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc\") pod \"8e546b43-e259-4d0e-81f8-2381371157bd\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.414472 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config\") pod \"8e546b43-e259-4d0e-81f8-2381371157bd\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.414575 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb\") pod \"8e546b43-e259-4d0e-81f8-2381371157bd\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.414608 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb\") pod \"8e546b43-e259-4d0e-81f8-2381371157bd\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.414635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2xqs\" (UniqueName: \"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs\") pod \"8e546b43-e259-4d0e-81f8-2381371157bd\" (UID: \"8e546b43-e259-4d0e-81f8-2381371157bd\") " Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.431159 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs" (OuterVolumeSpecName: "kube-api-access-v2xqs") pod "8e546b43-e259-4d0e-81f8-2381371157bd" (UID: "8e546b43-e259-4d0e-81f8-2381371157bd"). InnerVolumeSpecName "kube-api-access-v2xqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.471311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e546b43-e259-4d0e-81f8-2381371157bd" (UID: "8e546b43-e259-4d0e-81f8-2381371157bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.519386 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.519440 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2xqs\" (UniqueName: \"kubernetes.io/projected/8e546b43-e259-4d0e-81f8-2381371157bd-kube-api-access-v2xqs\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.519655 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e546b43-e259-4d0e-81f8-2381371157bd" (UID: "8e546b43-e259-4d0e-81f8-2381371157bd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.535702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config" (OuterVolumeSpecName: "config") pod "8e546b43-e259-4d0e-81f8-2381371157bd" (UID: "8e546b43-e259-4d0e-81f8-2381371157bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.552772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e546b43-e259-4d0e-81f8-2381371157bd" (UID: "8e546b43-e259-4d0e-81f8-2381371157bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.559006 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.621238 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.621276 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.621290 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e546b43-e259-4d0e-81f8-2381371157bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.643865 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-m977k-config-b6v8z"] Mar 18 15:57:10 crc kubenswrapper[4792]: E0318 15:57:10.644343 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="init" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.644361 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="init" Mar 18 15:57:10 crc kubenswrapper[4792]: E0318 15:57:10.644382 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="dnsmasq-dns" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.644389 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="dnsmasq-dns" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.644604 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="dnsmasq-dns" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.645330 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.647089 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.660319 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k-config-b6v8z"] Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.723746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.723834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h252f\" (UniqueName: \"kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.723921 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.724062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: 
\"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.724096 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.724155 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826238 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts\") pod 
\"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826454 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h252f\" (UniqueName: \"kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826802 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run\") pod \"ovn-controller-m977k-config-b6v8z\" 
(UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.826837 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.827688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.829613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.851840 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h252f\" (UniqueName: \"kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f\") pod \"ovn-controller-m977k-config-b6v8z\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:10 crc kubenswrapper[4792]: I0318 15:57:10.985749 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.153200 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ghqxp" event={"ID":"8e546b43-e259-4d0e-81f8-2381371157bd","Type":"ContainerDied","Data":"dc34ece1e0e033647b11168a027ebef8add535ef32b06c45b8edc927fcd7e3a4"} Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.153492 4792 scope.go:117] "RemoveContainer" containerID="1a49425564367c011a4dc8f6f02fb109cfcd29cda98c9a5d255c481d60111d94" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.153664 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ghqxp" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.170542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerStarted","Data":"c8f115ea38945674ba812fd56982b41e65a502103e68f5239b68b21bba217f9e"} Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.175355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerStarted","Data":"0f0e8ea90dacfdde716a4095e4ec74aed9c59ca9aa529d0bf412ff84ee2c9f76"} Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.176874 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.181581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"269b9513-d2bb-4890-b0fc-11271ef50754","Type":"ContainerStarted","Data":"064f4a99970a118a2284b468f6d1a27ceb8ad67362ab9902aa98ba5136664964"} Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.194990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerStarted","Data":"a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1"} Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.195583 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.214778 4792 scope.go:117] "RemoveContainer" containerID="9f7cb884f537d7764649e668e844c489199bd103b4ae8636dc9c6675e3118b23" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.227568 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.969246101 podStartE2EDuration="1m10.227539981s" podCreationTimestamp="2026-03-18 15:56:01 +0000 UTC" firstStartedPulling="2026-03-18 15:56:24.369126918 +0000 UTC m=+1333.238455855" lastFinishedPulling="2026-03-18 15:57:09.627420788 +0000 UTC m=+1378.496749735" observedRunningTime="2026-03-18 15:57:11.216596212 +0000 UTC m=+1380.085925159" watchObservedRunningTime="2026-03-18 15:57:11.227539981 +0000 UTC m=+1380.096868918" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.277518 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=-9223371960.57728 podStartE2EDuration="1m16.277495735s" podCreationTimestamp="2026-03-18 15:55:55 +0000 UTC" firstStartedPulling="2026-03-18 15:55:57.76721959 +0000 UTC m=+1306.636548517" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:11.258146838 +0000 UTC m=+1380.127475795" watchObservedRunningTime="2026-03-18 15:57:11.277495735 +0000 UTC m=+1380.146824672" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.308370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.356759 
4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371960.498035 podStartE2EDuration="1m16.356740174s" podCreationTimestamp="2026-03-18 15:55:55 +0000 UTC" firstStartedPulling="2026-03-18 15:55:57.245765167 +0000 UTC m=+1306.115094104" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:11.296862423 +0000 UTC m=+1380.166191360" watchObservedRunningTime="2026-03-18 15:57:11.356740174 +0000 UTC m=+1380.226069111" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.378215 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"] Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.405275 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ghqxp"] Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.596081 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k-config-b6v8z"] Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.614046 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b6qv9"] Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.617061 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.631363 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.660521 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b6qv9"] Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.760240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts\") pod \"root-account-create-update-b6qv9\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.760493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dprb\" (UniqueName: \"kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb\") pod \"root-account-create-update-b6qv9\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.864982 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dprb\" (UniqueName: \"kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb\") pod \"root-account-create-update-b6qv9\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.866754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts\") pod \"root-account-create-update-b6qv9\" (UID: 
\"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.867534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts\") pod \"root-account-create-update-b6qv9\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.893149 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dprb\" (UniqueName: \"kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb\") pod \"root-account-create-update-b6qv9\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:11 crc kubenswrapper[4792]: I0318 15:57:11.907987 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" path="/var/lib/kubelet/pods/8e546b43-e259-4d0e-81f8-2381371157bd/volumes" Mar 18 15:57:12 crc kubenswrapper[4792]: I0318 15:57:12.018538 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:12 crc kubenswrapper[4792]: I0318 15:57:12.219463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-b6v8z" event={"ID":"de52bdb9-b29a-4551-aa15-0a43e2b88909","Type":"ContainerStarted","Data":"b05bf655c6c629d37d3415b4fb0c06026d40ddfe51fc5f4cfa3001c67e0d8277"} Mar 18 15:57:12 crc kubenswrapper[4792]: I0318 15:57:12.219523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-b6v8z" event={"ID":"de52bdb9-b29a-4551-aa15-0a43e2b88909","Type":"ContainerStarted","Data":"7f2451bff71d9612454e1ce4f515fac2de1efe7ff934f50915fb6a51e837315c"} Mar 18 15:57:12 crc kubenswrapper[4792]: I0318 15:57:12.241204 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m977k-config-b6v8z" podStartSLOduration=2.241181904 podStartE2EDuration="2.241181904s" podCreationTimestamp="2026-03-18 15:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:12.239043075 +0000 UTC m=+1381.108372032" watchObservedRunningTime="2026-03-18 15:57:12.241181904 +0000 UTC m=+1381.110510851" Mar 18 15:57:13 crc kubenswrapper[4792]: I0318 15:57:13.248401 4792 generic.go:334] "Generic (PLEG): container finished" podID="de52bdb9-b29a-4551-aa15-0a43e2b88909" containerID="b05bf655c6c629d37d3415b4fb0c06026d40ddfe51fc5f4cfa3001c67e0d8277" exitCode=0 Mar 18 15:57:13 crc kubenswrapper[4792]: I0318 15:57:13.248453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-b6v8z" event={"ID":"de52bdb9-b29a-4551-aa15-0a43e2b88909","Type":"ContainerDied","Data":"b05bf655c6c629d37d3415b4fb0c06026d40ddfe51fc5f4cfa3001c67e0d8277"} Mar 18 15:57:13 crc kubenswrapper[4792]: I0318 15:57:13.727509 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:13 crc kubenswrapper[4792]: I0318 15:57:13.739794 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b6qv9"] Mar 18 15:57:13 crc kubenswrapper[4792]: I0318 15:57:13.778726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.263033 4792 generic.go:334] "Generic (PLEG): container finished" podID="5670c2b9-9a80-4670-a2d2-0135fbb5a77d" containerID="9aebd5d02c7c9b02bb4aba9a8be3b0d9f5ce8e4d3ebf7d71c7b27f99ffb7c783" exitCode=0 Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.263104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rks8d" event={"ID":"5670c2b9-9a80-4670-a2d2-0135fbb5a77d","Type":"ContainerDied","Data":"9aebd5d02c7c9b02bb4aba9a8be3b0d9f5ce8e4d3ebf7d71c7b27f99ffb7c783"} Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.267450 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b6qv9" event={"ID":"4a1ba4ec-faec-4924-bc29-306292f94c1d","Type":"ContainerStarted","Data":"9ddaf409b452419ceb06cf8585000096f662cf10bf6a11a5d59850cf54eba88c"} Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.267480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b6qv9" event={"ID":"4a1ba4ec-faec-4924-bc29-306292f94c1d","Type":"ContainerStarted","Data":"1b0568658fc85cb8c468af06c6c6c14f6eed4870a76bb4a9993ab4d9abfae6c9"} Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.271982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"269b9513-d2bb-4890-b0fc-11271ef50754","Type":"ContainerStarted","Data":"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3"} Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.299741 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-b6qv9" podStartSLOduration=3.299710588 podStartE2EDuration="3.299710588s" podCreationTimestamp="2026-03-18 15:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:14.294636256 +0000 UTC m=+1383.163965223" watchObservedRunningTime="2026-03-18 15:57:14.299710588 +0000 UTC m=+1383.169039525" Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.321266 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.466634991 podStartE2EDuration="6.321228595s" podCreationTimestamp="2026-03-18 15:57:08 +0000 UTC" firstStartedPulling="2026-03-18 15:57:10.573423332 +0000 UTC m=+1379.442752269" lastFinishedPulling="2026-03-18 15:57:13.428016936 +0000 UTC m=+1382.297345873" observedRunningTime="2026-03-18 15:57:14.312939111 +0000 UTC m=+1383.182268068" watchObservedRunningTime="2026-03-18 15:57:14.321228595 +0000 UTC m=+1383.190557532" Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.697663 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m977k" Mar 18 15:57:14 crc kubenswrapper[4792]: I0318 15:57:14.908338 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062277 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062358 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h252f\" (UniqueName: \"kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062520 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062614 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062704 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn\") pod \"de52bdb9-b29a-4551-aa15-0a43e2b88909\" (UID: \"de52bdb9-b29a-4551-aa15-0a43e2b88909\") " Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.062865 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.063388 4792 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.063393 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run" (OuterVolumeSpecName: "var-run") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.063438 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.064218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.064390 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts" (OuterVolumeSpecName: "scripts") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.099307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f" (OuterVolumeSpecName: "kube-api-access-h252f") pod "de52bdb9-b29a-4551-aa15-0a43e2b88909" (UID: "de52bdb9-b29a-4551-aa15-0a43e2b88909"). InnerVolumeSpecName "kube-api-access-h252f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.164695 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-ghqxp" podUID="8e546b43-e259-4d0e-81f8-2381371157bd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.165192 4792 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.165224 4792 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.165235 4792 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de52bdb9-b29a-4551-aa15-0a43e2b88909-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.165250 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de52bdb9-b29a-4551-aa15-0a43e2b88909-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.165264 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h252f\" (UniqueName: \"kubernetes.io/projected/de52bdb9-b29a-4551-aa15-0a43e2b88909-kube-api-access-h252f\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.290647 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a1ba4ec-faec-4924-bc29-306292f94c1d" containerID="9ddaf409b452419ceb06cf8585000096f662cf10bf6a11a5d59850cf54eba88c" exitCode=0 Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 
15:57:15.290734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b6qv9" event={"ID":"4a1ba4ec-faec-4924-bc29-306292f94c1d","Type":"ContainerDied","Data":"9ddaf409b452419ceb06cf8585000096f662cf10bf6a11a5d59850cf54eba88c"} Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.293938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-b6v8z" event={"ID":"de52bdb9-b29a-4551-aa15-0a43e2b88909","Type":"ContainerDied","Data":"7f2451bff71d9612454e1ce4f515fac2de1efe7ff934f50915fb6a51e837315c"} Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.294008 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f2451bff71d9612454e1ce4f515fac2de1efe7ff934f50915fb6a51e837315c" Mar 18 15:57:15 crc kubenswrapper[4792]: I0318 15:57:15.294168 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m977k-config-b6v8z" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.102910 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m977k-config-b6v8z"] Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.112881 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m977k-config-b6v8z"] Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.251570 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m977k-config-wdgkd"] Mar 18 15:57:16 crc kubenswrapper[4792]: E0318 15:57:16.252166 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de52bdb9-b29a-4551-aa15-0a43e2b88909" containerName="ovn-config" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.252190 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de52bdb9-b29a-4551-aa15-0a43e2b88909" containerName="ovn-config" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.252464 4792 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="de52bdb9-b29a-4551-aa15-0a43e2b88909" containerName="ovn-config" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.253442 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.255488 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.276945 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k-config-wdgkd"] Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.308837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.309356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.309395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.309517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rpt9t\" (UniqueName: \"kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.309548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.309737 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.412927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413069 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rpt9t\" (UniqueName: \"kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413121 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.413405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.414387 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.414757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.416305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.425236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.456180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpt9t\" (UniqueName: \"kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t\") pod \"ovn-controller-m977k-config-wdgkd\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:16 crc kubenswrapper[4792]: I0318 15:57:16.575804 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:17 crc kubenswrapper[4792]: I0318 15:57:17.093203 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 18 15:57:17 crc kubenswrapper[4792]: I0318 15:57:17.879425 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de52bdb9-b29a-4551-aa15-0a43e2b88909" path="/var/lib/kubelet/pods/de52bdb9-b29a-4551-aa15-0a43e2b88909/volumes" Mar 18 15:57:18 crc kubenswrapper[4792]: I0318 15:57:18.727738 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:18 crc kubenswrapper[4792]: I0318 15:57:18.730983 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:19 crc kubenswrapper[4792]: I0318 15:57:19.351155 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:21 crc kubenswrapper[4792]: I0318 15:57:21.993569 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:21 crc kubenswrapper[4792]: I0318 15:57:21.994665 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="prometheus" containerID="cri-o://e313a088eafeb6957bebebf6888575180f7655f78c16d6183ae9dcb439f10ee5" gracePeriod=600 Mar 18 15:57:21 crc kubenswrapper[4792]: I0318 15:57:21.994854 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="thanos-sidecar" 
containerID="cri-o://c8f115ea38945674ba812fd56982b41e65a502103e68f5239b68b21bba217f9e" gracePeriod=600 Mar 18 15:57:21 crc kubenswrapper[4792]: I0318 15:57:21.994908 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="config-reloader" containerID="cri-o://4f0591bb4e037d2141fc349b99e7b56de5eaeeb03c840b837da1ffde0a26b88a" gracePeriod=600 Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384466 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerID="c8f115ea38945674ba812fd56982b41e65a502103e68f5239b68b21bba217f9e" exitCode=0 Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384522 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerID="4f0591bb4e037d2141fc349b99e7b56de5eaeeb03c840b837da1ffde0a26b88a" exitCode=0 Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384532 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerID="e313a088eafeb6957bebebf6888575180f7655f78c16d6183ae9dcb439f10ee5" exitCode=0 Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384555 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerDied","Data":"c8f115ea38945674ba812fd56982b41e65a502103e68f5239b68b21bba217f9e"} Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerDied","Data":"4f0591bb4e037d2141fc349b99e7b56de5eaeeb03c840b837da1ffde0a26b88a"} Mar 18 15:57:22 crc kubenswrapper[4792]: I0318 15:57:22.384598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerDied","Data":"e313a088eafeb6957bebebf6888575180f7655f78c16d6183ae9dcb439f10ee5"} Mar 18 15:57:23 crc kubenswrapper[4792]: I0318 15:57:23.727657 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.143:9090/-/ready\": dial tcp 10.217.0.143:9090: connect: connection refused" Mar 18 15:57:25 crc kubenswrapper[4792]: I0318 15:57:25.340096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:57:25 crc kubenswrapper[4792]: I0318 15:57:25.347616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3c345496-7b4e-41f0-a5ae-4c503e452221-etc-swift\") pod \"swift-storage-0\" (UID: \"3c345496-7b4e-41f0-a5ae-4c503e452221\") " pod="openstack/swift-storage-0" Mar 18 15:57:25 crc kubenswrapper[4792]: I0318 15:57:25.432495 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 15:57:26 crc kubenswrapper[4792]: E0318 15:57:26.030643 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 18 15:57:26 crc kubenswrapper[4792]: E0318 15:57:26.030873 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnkc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxO
ptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-mqtz6_openstack(bc4c925f-3e43-4351-9703-7fdd44a1a9d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:57:26 crc kubenswrapper[4792]: E0318 15:57:26.032439 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-mqtz6" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.153390 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.170558 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.269784 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270163 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts\") pod \"4a1ba4ec-faec-4924-bc29-306292f94c1d\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270255 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270276 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdb55\" 
(UniqueName: \"kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270424 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dprb\" (UniqueName: \"kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb\") pod \"4a1ba4ec-faec-4924-bc29-306292f94c1d\" (UID: \"4a1ba4ec-faec-4924-bc29-306292f94c1d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270448 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.270584 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf\") pod \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\" (UID: \"5670c2b9-9a80-4670-a2d2-0135fbb5a77d\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.272044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.272170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a1ba4ec-faec-4924-bc29-306292f94c1d" (UID: "4a1ba4ec-faec-4924-bc29-306292f94c1d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.272245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.275284 4792 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.275315 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1ba4ec-faec-4924-bc29-306292f94c1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.275327 4792 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.280112 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55" (OuterVolumeSpecName: 
"kube-api-access-pdb55") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "kube-api-access-pdb55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.280320 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb" (OuterVolumeSpecName: "kube-api-access-6dprb") pod "4a1ba4ec-faec-4924-bc29-306292f94c1d" (UID: "4a1ba4ec-faec-4924-bc29-306292f94c1d"). InnerVolumeSpecName "kube-api-access-6dprb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.286394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.330232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.330782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts" (OuterVolumeSpecName: "scripts") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.332207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5670c2b9-9a80-4670-a2d2-0135fbb5a77d" (UID: "5670c2b9-9a80-4670-a2d2-0135fbb5a77d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377889 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdb55\" (UniqueName: \"kubernetes.io/projected/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-kube-api-access-pdb55\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377931 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dprb\" (UniqueName: \"kubernetes.io/projected/4a1ba4ec-faec-4924-bc29-306292f94c1d-kube-api-access-6dprb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377946 4792 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377960 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377986 4792 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.377999 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/5670c2b9-9a80-4670-a2d2-0135fbb5a77d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.432416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b6qv9" event={"ID":"4a1ba4ec-faec-4924-bc29-306292f94c1d","Type":"ContainerDied","Data":"1b0568658fc85cb8c468af06c6c6c14f6eed4870a76bb4a9993ab4d9abfae6c9"} Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.432471 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0568658fc85cb8c468af06c6c6c14f6eed4870a76bb4a9993ab4d9abfae6c9" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.432539 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b6qv9" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.437120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a","Type":"ContainerDied","Data":"7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b"} Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.437163 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2659b56726052c639620b2f307854a1360fef0fb4eb51726208730f736638b" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.439860 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rks8d" Mar 18 15:57:26 crc kubenswrapper[4792]: E0318 15:57:26.446823 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-mqtz6" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.446910 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rks8d" event={"ID":"5670c2b9-9a80-4670-a2d2-0135fbb5a77d","Type":"ContainerDied","Data":"d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6"} Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.447165 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61f27e594e0e8cc7ff8a734d3cbb93dde9492ca7c102ba164b184027e3344b6" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.447752 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.472368 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.542530 4792 scope.go:117] "RemoveContainer" containerID="0f1ad947bfab13337bc7241c0610d2c4eb91bc8bb5dd93a30b161eed8e766a2d" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.581424 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m977k-config-wdgkd"] Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.582735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.582814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.582846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.582880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583029 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4cxlr\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583074 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583237 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583278 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583359 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.583391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0\") pod \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\" (UID: \"9f8f9d61-a4a8-4579-9d91-722dfe7aa68a\") " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.584960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.585856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.586008 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.589345 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.590197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr" (OuterVolumeSpecName: "kube-api-access-4cxlr") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "kube-api-access-4cxlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.591170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out" (OuterVolumeSpecName: "config-out") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.595848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config" (OuterVolumeSpecName: "config") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.597757 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.652131 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config" (OuterVolumeSpecName: "web-config") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.671773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" (UID: "9f8f9d61-a4a8-4579-9d91-722dfe7aa68a"). InnerVolumeSpecName "pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 15:57:26 crc kubenswrapper[4792]: W0318 15:57:26.685927 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee4f0ea5_3787_4902_a07f_b86c36415b5d.slice/crio-a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892 WatchSource:0}: Error finding container a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892: Status 404 returned error can't find the container with id a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892 Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.706868 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxlr\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-kube-api-access-4cxlr\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.706898 4792 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.706937 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") on node \"crc\" " Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.706951 4792 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.706961 4792 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config-out\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.707010 4792 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.707024 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.707033 4792 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.707043 4792 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-web-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.707057 4792 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.773492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.779334 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.779482 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9") on node "crc" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.784893 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.811023 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:26 crc kubenswrapper[4792]: I0318 15:57:26.943728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.094209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.461272 4792 generic.go:334] "Generic (PLEG): container finished" podID="ee4f0ea5-3787-4902-a07f-b86c36415b5d" containerID="3e98eef60b92763f403fa2d548387d3ef4ea41ccf46420bd01da67c93ee7b5f9" exitCode=0 Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.461823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-wdgkd" event={"ID":"ee4f0ea5-3787-4902-a07f-b86c36415b5d","Type":"ContainerDied","Data":"3e98eef60b92763f403fa2d548387d3ef4ea41ccf46420bd01da67c93ee7b5f9"} Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.461859 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-wdgkd" event={"ID":"ee4f0ea5-3787-4902-a07f-b86c36415b5d","Type":"ContainerStarted","Data":"a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892"} Mar 18 
15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.466581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"65eccc25a3a69a3778b73d3746172b1f093106026ff613c89342722fab78394d"} Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.466634 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.547411 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.555880 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.575770 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576278 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5670c2b9-9a80-4670-a2d2-0135fbb5a77d" containerName="swift-ring-rebalance" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576460 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5670c2b9-9a80-4670-a2d2-0135fbb5a77d" containerName="swift-ring-rebalance" Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576483 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="config-reloader" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576491 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="config-reloader" Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576509 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="thanos-sidecar" Mar 18 15:57:27 crc 
kubenswrapper[4792]: I0318 15:57:27.576527 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="thanos-sidecar" Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576537 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="init-config-reloader" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576543 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="init-config-reloader" Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576567 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1ba4ec-faec-4924-bc29-306292f94c1d" containerName="mariadb-account-create-update" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576579 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1ba4ec-faec-4924-bc29-306292f94c1d" containerName="mariadb-account-create-update" Mar 18 15:57:27 crc kubenswrapper[4792]: E0318 15:57:27.576597 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="prometheus" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576603 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="prometheus" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.576961 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="config-reloader" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.577003 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="prometheus" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.577015 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5670c2b9-9a80-4670-a2d2-0135fbb5a77d" containerName="swift-ring-rebalance" 
Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.577027 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1ba4ec-faec-4924-bc29-306292f94c1d" containerName="mariadb-account-create-update" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.577040 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" containerName="thanos-sidecar" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.579427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.588913 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.589274 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.589304 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.589486 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.589761 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.589956 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-srpm2" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.592079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.592442 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.595311 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.599126 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.641819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.641892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.641919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzjx\" (UniqueName: 
\"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-kube-api-access-fbzjx\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642074 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642096 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb295773-c070-4d90-b351-cac7e8fa1017-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 
15:57:27.642193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642347 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.642504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745138 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745265 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzjx\" (UniqueName: \"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-kube-api-access-fbzjx\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745413 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745467 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745680 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb295773-c070-4d90-b351-cac7e8fa1017-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 
18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745826 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745917 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.745946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.747260 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.748423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.748999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb295773-c070-4d90-b351-cac7e8fa1017-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.753710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.757924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.758233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.758329 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.758937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.760481 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.760735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb295773-c070-4d90-b351-cac7e8fa1017-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.760816 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb295773-c070-4d90-b351-cac7e8fa1017-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.774756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzjx\" (UniqueName: \"kubernetes.io/projected/fb295773-c070-4d90-b351-cac7e8fa1017-kube-api-access-fbzjx\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.861732 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.861783 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/45fd3dc2e95d3e59015b40d9f64664aa445807a0df4a2d1f8578969394abae77/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:27 crc kubenswrapper[4792]: I0318 15:57:27.909437 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8f9d61-a4a8-4579-9d91-722dfe7aa68a" path="/var/lib/kubelet/pods/9f8f9d61-a4a8-4579-9d91-722dfe7aa68a/volumes" Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.000150 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b6qv9"] Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.021385 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-b6qv9"] Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.165223 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb4d23b2-5647-47c0-bd53-07042c5d1dd9\") pod \"prometheus-metric-storage-0\" (UID: \"fb295773-c070-4d90-b351-cac7e8fa1017\") " pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.206603 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.857464 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-9t8k4"] Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.867168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.936111 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.989968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9t8k4"] Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.993520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts\") pod \"heat-db-create-9t8k4\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:28 crc kubenswrapper[4792]: I0318 15:57:28.993774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p\") pod 
\"heat-db-create-9t8k4\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.032987 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-65b5-account-create-update-d89tz"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.034367 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.037042 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.057424 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-65b5-account-create-update-d89tz"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.079541 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sb8pv"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.082559 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.096416 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p\") pod \"heat-db-create-9t8k4\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.096477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.096513 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts\") pod \"heat-db-create-9t8k4\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.096651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fssg2\" (UniqueName: \"kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.097690 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts\") pod \"heat-db-create-9t8k4\" (UID: 
\"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.105694 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb8pv"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.130221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p\") pod \"heat-db-create-9t8k4\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.198678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.198913 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fssg2\" (UniqueName: \"kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.198994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.199023 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvkt6\" (UniqueName: \"kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.200508 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.223074 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5wbx7"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.224942 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.228117 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.236219 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5wbx7"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.247635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fssg2\" (UniqueName: \"kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2\") pod \"cinder-65b5-account-create-update-d89tz\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.247875 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-da5b-account-create-update-t7nwf"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.249396 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.257099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.262112 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5x8d9"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.264017 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.273842 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.274144 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.274439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.274710 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-b84g6" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.276797 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5x8d9"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.295862 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-da5b-account-create-update-t7nwf"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.300655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.300704 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvkt6\" (UniqueName: \"kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.300867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jm8nj\" (UniqueName: \"kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.301010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.302666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.340622 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rrksn"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.341579 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvkt6\" (UniqueName: \"kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6\") pod \"cinder-db-create-sb8pv\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.342063 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.357523 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rrksn"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.384501 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403471 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7z7\" (UniqueName: \"kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7\") pod \"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle\") pod \"keystone-db-sync-5x8d9\" (UID: 
\"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403656 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5hb\" (UniqueName: \"kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403700 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8nj\" (UniqueName: \"kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403852 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4r5\" (UniqueName: \"kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.403883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts\") pod 
\"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.404830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.420612 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.434765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8nj\" (UniqueName: \"kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj\") pod \"neutron-db-create-5wbx7\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.505572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np4r5\" (UniqueName: \"kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.505647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts\") pod \"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.505787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.505849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7z7\" (UniqueName: \"kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7\") pod \"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.506041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.506076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.506136 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5hb\" (UniqueName: \"kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.506676 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts\") pod \"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.507131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.511179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.514827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.550254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4r5\" (UniqueName: \"kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5\") pod \"keystone-db-sync-5x8d9\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.551928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7z7\" (UniqueName: \"kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7\") pod 
\"barbican-db-create-rrksn\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.555992 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5hb\" (UniqueName: \"kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb\") pod \"heat-da5b-account-create-update-t7nwf\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.558500 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-644b-account-create-update-slgb4"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.560109 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.565066 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.571680 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-644b-account-create-update-slgb4"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.597343 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.614722 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.621437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.663400 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b0df-account-create-update-2r2xz"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.665638 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.672457 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b0df-account-create-update-2r2xz"] Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.678458 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.696678 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.711007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.711057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnh9h\" (UniqueName: \"kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.783336 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.813193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmtn\" (UniqueName: \"kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.813270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.813337 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.813382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnh9h\" (UniqueName: \"kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.814191 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.835053 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnh9h\" (UniqueName: \"kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h\") pod \"neutron-644b-account-create-update-slgb4\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.897601 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1ba4ec-faec-4924-bc29-306292f94c1d" path="/var/lib/kubelet/pods/4a1ba4ec-faec-4924-bc29-306292f94c1d/volumes" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.914568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.914668 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpt9t\" (UniqueName: \"kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.914827 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc 
kubenswrapper[4792]: I0318 15:57:29.914854 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915025 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn\") pod \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\" (UID: \"ee4f0ea5-3787-4902-a07f-b86c36415b5d\") " Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmtn\" (UniqueName: \"kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915616 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run" (OuterVolumeSpecName: "var-run") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915833 4792 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915870 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.915893 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.916440 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.917747 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts" (OuterVolumeSpecName: "scripts") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.917919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.922287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t" (OuterVolumeSpecName: "kube-api-access-rpt9t") pod "ee4f0ea5-3787-4902-a07f-b86c36415b5d" (UID: "ee4f0ea5-3787-4902-a07f-b86c36415b5d"). InnerVolumeSpecName "kube-api-access-rpt9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.931552 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:29 crc kubenswrapper[4792]: I0318 15:57:29.943233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmtn\" (UniqueName: \"kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn\") pod \"barbican-b0df-account-create-update-2r2xz\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.019601 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpt9t\" (UniqueName: \"kubernetes.io/projected/ee4f0ea5-3787-4902-a07f-b86c36415b5d-kube-api-access-rpt9t\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.019654 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.019668 4792 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.019680 4792 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee4f0ea5-3787-4902-a07f-b86c36415b5d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.019691 4792 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee4f0ea5-3787-4902-a07f-b86c36415b5d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.083096 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.394344 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-65b5-account-create-update-d89tz"] Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.515018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerStarted","Data":"af84ecd0bd7c3f9b53db84e7ce21db21401c4ba6c568b3480537f146d4bb486b"} Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.517820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m977k-config-wdgkd" event={"ID":"ee4f0ea5-3787-4902-a07f-b86c36415b5d","Type":"ContainerDied","Data":"a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892"} Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.517852 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90ea3bc110bbe6c38e5518974c71e5ab428066e73803ef122004d1269baf892" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.517929 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m977k-config-wdgkd" Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.602936 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-da5b-account-create-update-t7nwf"] Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.960404 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m977k-config-wdgkd"] Mar 18 15:57:30 crc kubenswrapper[4792]: I0318 15:57:30.982053 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m977k-config-wdgkd"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.044511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5wbx7"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.347993 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5x8d9"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.375362 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9t8k4"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.533595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5x8d9" event={"ID":"7efa06c0-f363-43b0-ba89-96d41ff9db74","Type":"ContainerStarted","Data":"e7fe498e82aaff81599837bc25c7937f5be2db4cf341b727d9c1592082e95af6"} Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.535872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-65b5-account-create-update-d89tz" event={"ID":"ddad064f-85b2-4334-9b1f-af2e8037a328","Type":"ContainerStarted","Data":"3865d629257727c1c4760f19d6657167f013e0da3c90a9269032276ef90e21ff"} Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.537859 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-da5b-account-create-update-t7nwf" 
event={"ID":"d46fc90c-8795-46d0-b6b4-e386c126ff37","Type":"ContainerStarted","Data":"38876865f7c2f200914a27a717e7f33628b207504988b4f75c999a7f14f6eb55"} Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.539798 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5wbx7" event={"ID":"3282b7a4-a673-4f26-9395-3fbcfe76fea4","Type":"ContainerStarted","Data":"97ccf14140d203e352e8ec2ca855e5731f9aeac09982fe395ed6857ca582be42"} Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.541622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9t8k4" event={"ID":"162c4e7b-9b94-4363-aa4e-25cbb6cce669","Type":"ContainerStarted","Data":"9baf96070affea98129cdd99a103cb0ab974fb43c983c1fa7253b290f195920a"} Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.750058 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rrksn"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.763641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-644b-account-create-update-slgb4"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.777903 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb8pv"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.808806 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gxvdh"] Mar 18 15:57:31 crc kubenswrapper[4792]: E0318 15:57:31.809509 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4f0ea5-3787-4902-a07f-b86c36415b5d" containerName="ovn-config" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.809540 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4f0ea5-3787-4902-a07f-b86c36415b5d" containerName="ovn-config" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.809839 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4f0ea5-3787-4902-a07f-b86c36415b5d" 
containerName="ovn-config" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.811283 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.828053 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gxvdh"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.834422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b0df-account-create-update-2r2xz"] Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.840381 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.906358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jwg\" (UniqueName: \"kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.906726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:31 crc kubenswrapper[4792]: I0318 15:57:31.913340 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4f0ea5-3787-4902-a07f-b86c36415b5d" path="/var/lib/kubelet/pods/ee4f0ea5-3787-4902-a07f-b86c36415b5d/volumes" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.019825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r2jwg\" (UniqueName: \"kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.020017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.022277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.043794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jwg\" (UniqueName: \"kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg\") pod \"root-account-create-update-gxvdh\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.249305 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.551700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"26d1561f464b5a75a821c63dc0a34c09e2b7de8b63fbff5ace7622ac0c0138a1"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.553461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-da5b-account-create-update-t7nwf" event={"ID":"d46fc90c-8795-46d0-b6b4-e386c126ff37","Type":"ContainerStarted","Data":"828e7cc77860e7196d1cbe036ea1978e973b72a92ba3b6d8c571f80418c6e60a"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.555617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5wbx7" event={"ID":"3282b7a4-a673-4f26-9395-3fbcfe76fea4","Type":"ContainerStarted","Data":"dc0b9d721852424d88be1ae20b397b8b295c0f70a1ff28b55d95b176afb4fee5"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.556791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rrksn" event={"ID":"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58","Type":"ContainerStarted","Data":"1958c4d8161500fa0af4abaf325be4be865f40b160eb821d282bc5d5c0177a1c"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.558726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-65b5-account-create-update-d89tz" event={"ID":"ddad064f-85b2-4334-9b1f-af2e8037a328","Type":"ContainerStarted","Data":"39815edeed392e180c806a188fbbabdf3f1434bf2d373688b56d7e4407264003"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.560560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb8pv" event={"ID":"124caafa-8fb5-40be-b0bb-233a7848176f","Type":"ContainerStarted","Data":"6e0c5b1310f169b551ac32f590b2794206ce9939255fd3b3d72fa1519e554ed1"} Mar 18 15:57:32 crc kubenswrapper[4792]: 
I0318 15:57:32.561963 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b0df-account-create-update-2r2xz" event={"ID":"6755e276-5f4a-45db-850e-97ff887e55ae","Type":"ContainerStarted","Data":"78c76449dc612be450086d1f61ddf0530361bd5be8e05713ac026234a246835d"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.569491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9t8k4" event={"ID":"162c4e7b-9b94-4363-aa4e-25cbb6cce669","Type":"ContainerStarted","Data":"3ebe52b0c9be12f9ba65dfc6e682f55a9d94b0898f08328ae8aaf62639e8df07"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.572301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-644b-account-create-update-slgb4" event={"ID":"f14da68d-4090-4890-ba72-195a943a722b","Type":"ContainerStarted","Data":"6a9d0f19b67f372c651a9b41515f929c9b335605ed6f3d805d9bad12103d0aee"} Mar 18 15:57:32 crc kubenswrapper[4792]: I0318 15:57:32.746707 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gxvdh"] Mar 18 15:57:33 crc kubenswrapper[4792]: I0318 15:57:33.583453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxvdh" event={"ID":"04d7066d-b000-4a19-a381-60766be81585","Type":"ContainerStarted","Data":"e5641a060db5462e2876dbe755003aa767eff82a6cb16e7625f55e6aee0a6299"} Mar 18 15:57:34 crc kubenswrapper[4792]: I0318 15:57:34.597083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerStarted","Data":"9db38b350399a3fd5781c583df2d67440a7daaebad8edbde136ccbb2576667d5"} Mar 18 15:57:34 crc kubenswrapper[4792]: I0318 15:57:34.628676 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-5wbx7" podStartSLOduration=5.628633765 podStartE2EDuration="5.628633765s" podCreationTimestamp="2026-03-18 
15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.618372967 +0000 UTC m=+1403.487701914" watchObservedRunningTime="2026-03-18 15:57:34.628633765 +0000 UTC m=+1403.497962712" Mar 18 15:57:34 crc kubenswrapper[4792]: I0318 15:57:34.648333 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-da5b-account-create-update-t7nwf" podStartSLOduration=5.648303773 podStartE2EDuration="5.648303773s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.633277163 +0000 UTC m=+1403.502606120" watchObservedRunningTime="2026-03-18 15:57:34.648303773 +0000 UTC m=+1403.517632720" Mar 18 15:57:34 crc kubenswrapper[4792]: I0318 15:57:34.693612 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-65b5-account-create-update-d89tz" podStartSLOduration=6.693588759 podStartE2EDuration="6.693588759s" podCreationTimestamp="2026-03-18 15:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:34.661145173 +0000 UTC m=+1403.530474110" watchObservedRunningTime="2026-03-18 15:57:34.693588759 +0000 UTC m=+1403.562917696" Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.614678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxvdh" event={"ID":"04d7066d-b000-4a19-a381-60766be81585","Type":"ContainerStarted","Data":"5e6d2747378d6eb32183d9d2e95f609701c942b1fa7a10ff138275778fcdc71e"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.618223 4792 generic.go:334] "Generic (PLEG): container finished" podID="3282b7a4-a673-4f26-9395-3fbcfe76fea4" 
containerID="dc0b9d721852424d88be1ae20b397b8b295c0f70a1ff28b55d95b176afb4fee5" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.618281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5wbx7" event={"ID":"3282b7a4-a673-4f26-9395-3fbcfe76fea4","Type":"ContainerDied","Data":"dc0b9d721852424d88be1ae20b397b8b295c0f70a1ff28b55d95b176afb4fee5"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.630949 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" containerID="8459a1a5f8dec12d555191105872ff4ec2ee9b3fd2d809b4dddb67fe98be2914" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.631053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rrksn" event={"ID":"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58","Type":"ContainerDied","Data":"8459a1a5f8dec12d555191105872ff4ec2ee9b3fd2d809b4dddb67fe98be2914"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.635629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-644b-account-create-update-slgb4" event={"ID":"f14da68d-4090-4890-ba72-195a943a722b","Type":"ContainerStarted","Data":"017b0d5434ac40aa8e667a47bc17ea84d73b66c78bfeaa2d88307c0f7115fb8e"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.639318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gxvdh" podStartSLOduration=4.639295314 podStartE2EDuration="4.639295314s" podCreationTimestamp="2026-03-18 15:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:35.629832281 +0000 UTC m=+1404.499161228" watchObservedRunningTime="2026-03-18 15:57:35.639295314 +0000 UTC m=+1404.508624251" Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.640044 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="ddad064f-85b2-4334-9b1f-af2e8037a328" containerID="39815edeed392e180c806a188fbbabdf3f1434bf2d373688b56d7e4407264003" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.640107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-65b5-account-create-update-d89tz" event={"ID":"ddad064f-85b2-4334-9b1f-af2e8037a328","Type":"ContainerDied","Data":"39815edeed392e180c806a188fbbabdf3f1434bf2d373688b56d7e4407264003"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.646537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"c7b4dca7adf6d184603fc8cb8cf6b8feca1ff8d7d7d077be8a104bf654de71e4"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.646587 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"662099b7869abc201128d95bd880a5af6b6ef5fcad97b63cbad44dd7d59f92fd"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.646601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"bdbcbc0b3d63295b0dd34dc681ce9aa7151b8f0947df16175cf8734e3c12fcde"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.657767 4792 generic.go:334] "Generic (PLEG): container finished" podID="d46fc90c-8795-46d0-b6b4-e386c126ff37" containerID="828e7cc77860e7196d1cbe036ea1978e973b72a92ba3b6d8c571f80418c6e60a" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.657837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-da5b-account-create-update-t7nwf" event={"ID":"d46fc90c-8795-46d0-b6b4-e386c126ff37","Type":"ContainerDied","Data":"828e7cc77860e7196d1cbe036ea1978e973b72a92ba3b6d8c571f80418c6e60a"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 
15:57:35.662217 4792 generic.go:334] "Generic (PLEG): container finished" podID="124caafa-8fb5-40be-b0bb-233a7848176f" containerID="664ad61c5e3740497e16c283a9ff433392cdb65c4e1a8dc0d57514d174a761d1" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.662289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb8pv" event={"ID":"124caafa-8fb5-40be-b0bb-233a7848176f","Type":"ContainerDied","Data":"664ad61c5e3740497e16c283a9ff433392cdb65c4e1a8dc0d57514d174a761d1"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.672212 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b0df-account-create-update-2r2xz" event={"ID":"6755e276-5f4a-45db-850e-97ff887e55ae","Type":"ContainerStarted","Data":"b506889efdf1ae7205f5d60ee0985b148c37a6c0003af6830978158b814dacfc"} Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.674472 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-644b-account-create-update-slgb4" podStartSLOduration=6.674454775 podStartE2EDuration="6.674454775s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:35.671634225 +0000 UTC m=+1404.540963172" watchObservedRunningTime="2026-03-18 15:57:35.674454775 +0000 UTC m=+1404.543783712" Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.675250 4792 generic.go:334] "Generic (PLEG): container finished" podID="162c4e7b-9b94-4363-aa4e-25cbb6cce669" containerID="3ebe52b0c9be12f9ba65dfc6e682f55a9d94b0898f08328ae8aaf62639e8df07" exitCode=0 Mar 18 15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.675306 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9t8k4" event={"ID":"162c4e7b-9b94-4363-aa4e-25cbb6cce669","Type":"ContainerDied","Data":"3ebe52b0c9be12f9ba65dfc6e682f55a9d94b0898f08328ae8aaf62639e8df07"} Mar 18 
15:57:35 crc kubenswrapper[4792]: I0318 15:57:35.793704 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b0df-account-create-update-2r2xz" podStartSLOduration=6.793683452 podStartE2EDuration="6.793683452s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:35.785267653 +0000 UTC m=+1404.654596610" watchObservedRunningTime="2026-03-18 15:57:35.793683452 +0000 UTC m=+1404.663012389" Mar 18 15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.685722 4792 generic.go:334] "Generic (PLEG): container finished" podID="04d7066d-b000-4a19-a381-60766be81585" containerID="5e6d2747378d6eb32183d9d2e95f609701c942b1fa7a10ff138275778fcdc71e" exitCode=0 Mar 18 15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.685768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxvdh" event={"ID":"04d7066d-b000-4a19-a381-60766be81585","Type":"ContainerDied","Data":"5e6d2747378d6eb32183d9d2e95f609701c942b1fa7a10ff138275778fcdc71e"} Mar 18 15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.689571 4792 generic.go:334] "Generic (PLEG): container finished" podID="6755e276-5f4a-45db-850e-97ff887e55ae" containerID="b506889efdf1ae7205f5d60ee0985b148c37a6c0003af6830978158b814dacfc" exitCode=0 Mar 18 15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.689716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b0df-account-create-update-2r2xz" event={"ID":"6755e276-5f4a-45db-850e-97ff887e55ae","Type":"ContainerDied","Data":"b506889efdf1ae7205f5d60ee0985b148c37a6c0003af6830978158b814dacfc"} Mar 18 15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.692023 4792 generic.go:334] "Generic (PLEG): container finished" podID="f14da68d-4090-4890-ba72-195a943a722b" containerID="017b0d5434ac40aa8e667a47bc17ea84d73b66c78bfeaa2d88307c0f7115fb8e" exitCode=0 Mar 18 
15:57:36 crc kubenswrapper[4792]: I0318 15:57:36.692135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-644b-account-create-update-slgb4" event={"ID":"f14da68d-4090-4890-ba72-195a943a722b","Type":"ContainerDied","Data":"017b0d5434ac40aa8e667a47bc17ea84d73b66c78bfeaa2d88307c0f7115fb8e"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.251371 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.434355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts\") pod \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.434872 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p\") pod \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\" (UID: \"162c4e7b-9b94-4363-aa4e-25cbb6cce669\") " Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.435143 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "162c4e7b-9b94-4363-aa4e-25cbb6cce669" (UID: "162c4e7b-9b94-4363-aa4e-25cbb6cce669"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.436571 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162c4e7b-9b94-4363-aa4e-25cbb6cce669-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.439923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p" (OuterVolumeSpecName: "kube-api-access-7828p") pod "162c4e7b-9b94-4363-aa4e-25cbb6cce669" (UID: "162c4e7b-9b94-4363-aa4e-25cbb6cce669"). InnerVolumeSpecName "kube-api-access-7828p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.539321 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/162c4e7b-9b94-4363-aa4e-25cbb6cce669-kube-api-access-7828p\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.731319 4792 generic.go:334] "Generic (PLEG): container finished" podID="fb295773-c070-4d90-b351-cac7e8fa1017" containerID="9db38b350399a3fd5781c583df2d67440a7daaebad8edbde136ccbb2576667d5" exitCode=0 Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.731381 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerDied","Data":"9db38b350399a3fd5781c583df2d67440a7daaebad8edbde136ccbb2576667d5"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.734760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxvdh" event={"ID":"04d7066d-b000-4a19-a381-60766be81585","Type":"ContainerDied","Data":"e5641a060db5462e2876dbe755003aa767eff82a6cb16e7625f55e6aee0a6299"} Mar 18 15:57:39 crc kubenswrapper[4792]: 
I0318 15:57:39.734805 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5641a060db5462e2876dbe755003aa767eff82a6cb16e7625f55e6aee0a6299" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.736923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-da5b-account-create-update-t7nwf" event={"ID":"d46fc90c-8795-46d0-b6b4-e386c126ff37","Type":"ContainerDied","Data":"38876865f7c2f200914a27a717e7f33628b207504988b4f75c999a7f14f6eb55"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.736945 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38876865f7c2f200914a27a717e7f33628b207504988b4f75c999a7f14f6eb55" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.738595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9t8k4" event={"ID":"162c4e7b-9b94-4363-aa4e-25cbb6cce669","Type":"ContainerDied","Data":"9baf96070affea98129cdd99a103cb0ab974fb43c983c1fa7253b290f195920a"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.738615 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9baf96070affea98129cdd99a103cb0ab974fb43c983c1fa7253b290f195920a" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.738664 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-9t8k4" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.747039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-644b-account-create-update-slgb4" event={"ID":"f14da68d-4090-4890-ba72-195a943a722b","Type":"ContainerDied","Data":"6a9d0f19b67f372c651a9b41515f929c9b335605ed6f3d805d9bad12103d0aee"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.747090 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9d0f19b67f372c651a9b41515f929c9b335605ed6f3d805d9bad12103d0aee" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.750762 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-65b5-account-create-update-d89tz" event={"ID":"ddad064f-85b2-4334-9b1f-af2e8037a328","Type":"ContainerDied","Data":"3865d629257727c1c4760f19d6657167f013e0da3c90a9269032276ef90e21ff"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.750795 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3865d629257727c1c4760f19d6657167f013e0da3c90a9269032276ef90e21ff" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.752492 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb8pv" event={"ID":"124caafa-8fb5-40be-b0bb-233a7848176f","Type":"ContainerDied","Data":"6e0c5b1310f169b551ac32f590b2794206ce9939255fd3b3d72fa1519e554ed1"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.752530 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e0c5b1310f169b551ac32f590b2794206ce9939255fd3b3d72fa1519e554ed1" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.754600 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5wbx7" event={"ID":"3282b7a4-a673-4f26-9395-3fbcfe76fea4","Type":"ContainerDied","Data":"97ccf14140d203e352e8ec2ca855e5731f9aeac09982fe395ed6857ca582be42"} Mar 18 15:57:39 crc 
kubenswrapper[4792]: I0318 15:57:39.754627 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ccf14140d203e352e8ec2ca855e5731f9aeac09982fe395ed6857ca582be42" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.756878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b0df-account-create-update-2r2xz" event={"ID":"6755e276-5f4a-45db-850e-97ff887e55ae","Type":"ContainerDied","Data":"78c76449dc612be450086d1f61ddf0530361bd5be8e05713ac026234a246835d"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.756907 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78c76449dc612be450086d1f61ddf0530361bd5be8e05713ac026234a246835d" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.758348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rrksn" event={"ID":"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58","Type":"ContainerDied","Data":"1958c4d8161500fa0af4abaf325be4be865f40b160eb821d282bc5d5c0177a1c"} Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.758388 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1958c4d8161500fa0af4abaf325be4be865f40b160eb821d282bc5d5c0177a1c" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.796231 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.876241 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.906899 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.947764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts\") pod \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.947901 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw7z7\" (UniqueName: \"kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7\") pod \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\" (UID: \"2ea659dd-3f5f-4942-9c6c-ad15ec82bd58\") " Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.948231 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.948453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" (UID: "2ea659dd-3f5f-4942-9c6c-ad15ec82bd58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.949108 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.949564 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.980395 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7" (OuterVolumeSpecName: "kube-api-access-sw7z7") pod "2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" (UID: "2ea659dd-3f5f-4942-9c6c-ad15ec82bd58"). InnerVolumeSpecName "kube-api-access-sw7z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.986809 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.988263 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:39 crc kubenswrapper[4792]: I0318 15:57:39.988331 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm8nj\" (UniqueName: \"kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj\") pod \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053553 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts\") pod \"d46fc90c-8795-46d0-b6b4-e386c126ff37\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053655 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fssg2\" (UniqueName: \"kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2\") pod \"ddad064f-85b2-4334-9b1f-af2e8037a328\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053693 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts\") pod \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\" (UID: \"3282b7a4-a673-4f26-9395-3fbcfe76fea4\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053747 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts\") pod \"124caafa-8fb5-40be-b0bb-233a7848176f\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053806 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bvkt6\" (UniqueName: \"kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6\") pod \"124caafa-8fb5-40be-b0bb-233a7848176f\" (UID: \"124caafa-8fb5-40be-b0bb-233a7848176f\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.053949 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts\") pod \"ddad064f-85b2-4334-9b1f-af2e8037a328\" (UID: \"ddad064f-85b2-4334-9b1f-af2e8037a328\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.054012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw5hb\" (UniqueName: \"kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb\") pod \"d46fc90c-8795-46d0-b6b4-e386c126ff37\" (UID: \"d46fc90c-8795-46d0-b6b4-e386c126ff37\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.054660 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw7z7\" (UniqueName: \"kubernetes.io/projected/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58-kube-api-access-sw7z7\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.055702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d46fc90c-8795-46d0-b6b4-e386c126ff37" (UID: "d46fc90c-8795-46d0-b6b4-e386c126ff37"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.055773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3282b7a4-a673-4f26-9395-3fbcfe76fea4" (UID: "3282b7a4-a673-4f26-9395-3fbcfe76fea4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.056938 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddad064f-85b2-4334-9b1f-af2e8037a328" (UID: "ddad064f-85b2-4334-9b1f-af2e8037a328"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.057129 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "124caafa-8fb5-40be-b0bb-233a7848176f" (UID: "124caafa-8fb5-40be-b0bb-233a7848176f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.063571 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj" (OuterVolumeSpecName: "kube-api-access-jm8nj") pod "3282b7a4-a673-4f26-9395-3fbcfe76fea4" (UID: "3282b7a4-a673-4f26-9395-3fbcfe76fea4"). InnerVolumeSpecName "kube-api-access-jm8nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.067050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2" (OuterVolumeSpecName: "kube-api-access-fssg2") pod "ddad064f-85b2-4334-9b1f-af2e8037a328" (UID: "ddad064f-85b2-4334-9b1f-af2e8037a328"). InnerVolumeSpecName "kube-api-access-fssg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.070065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb" (OuterVolumeSpecName: "kube-api-access-fw5hb") pod "d46fc90c-8795-46d0-b6b4-e386c126ff37" (UID: "d46fc90c-8795-46d0-b6b4-e386c126ff37"). InnerVolumeSpecName "kube-api-access-fw5hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.081993 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6" (OuterVolumeSpecName: "kube-api-access-bvkt6") pod "124caafa-8fb5-40be-b0bb-233a7848176f" (UID: "124caafa-8fb5-40be-b0bb-233a7848176f"). InnerVolumeSpecName "kube-api-access-bvkt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156059 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnh9h\" (UniqueName: \"kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h\") pod \"f14da68d-4090-4890-ba72-195a943a722b\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156157 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmtn\" (UniqueName: \"kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn\") pod \"6755e276-5f4a-45db-850e-97ff887e55ae\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts\") pod \"f14da68d-4090-4890-ba72-195a943a722b\" (UID: \"f14da68d-4090-4890-ba72-195a943a722b\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156373 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts\") pod \"6755e276-5f4a-45db-850e-97ff887e55ae\" (UID: \"6755e276-5f4a-45db-850e-97ff887e55ae\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts\") pod \"04d7066d-b000-4a19-a381-60766be81585\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.156550 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-r2jwg\" (UniqueName: \"kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg\") pod \"04d7066d-b000-4a19-a381-60766be81585\" (UID: \"04d7066d-b000-4a19-a381-60766be81585\") " Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157160 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f14da68d-4090-4890-ba72-195a943a722b" (UID: "f14da68d-4090-4890-ba72-195a943a722b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157180 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm8nj\" (UniqueName: \"kubernetes.io/projected/3282b7a4-a673-4f26-9395-3fbcfe76fea4-kube-api-access-jm8nj\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157239 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d46fc90c-8795-46d0-b6b4-e386c126ff37-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157255 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fssg2\" (UniqueName: \"kubernetes.io/projected/ddad064f-85b2-4334-9b1f-af2e8037a328-kube-api-access-fssg2\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157268 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3282b7a4-a673-4f26-9395-3fbcfe76fea4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157280 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/124caafa-8fb5-40be-b0bb-233a7848176f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157293 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvkt6\" (UniqueName: \"kubernetes.io/projected/124caafa-8fb5-40be-b0bb-233a7848176f-kube-api-access-bvkt6\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157304 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddad064f-85b2-4334-9b1f-af2e8037a328-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157482 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw5hb\" (UniqueName: \"kubernetes.io/projected/d46fc90c-8795-46d0-b6b4-e386c126ff37-kube-api-access-fw5hb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04d7066d-b000-4a19-a381-60766be81585" (UID: "04d7066d-b000-4a19-a381-60766be81585"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.157856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6755e276-5f4a-45db-850e-97ff887e55ae" (UID: "6755e276-5f4a-45db-850e-97ff887e55ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.161493 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h" (OuterVolumeSpecName: "kube-api-access-cnh9h") pod "f14da68d-4090-4890-ba72-195a943a722b" (UID: "f14da68d-4090-4890-ba72-195a943a722b"). InnerVolumeSpecName "kube-api-access-cnh9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.161735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn" (OuterVolumeSpecName: "kube-api-access-mdmtn") pod "6755e276-5f4a-45db-850e-97ff887e55ae" (UID: "6755e276-5f4a-45db-850e-97ff887e55ae"). InnerVolumeSpecName "kube-api-access-mdmtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.163052 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg" (OuterVolumeSpecName: "kube-api-access-r2jwg") pod "04d7066d-b000-4a19-a381-60766be81585" (UID: "04d7066d-b000-4a19-a381-60766be81585"). InnerVolumeSpecName "kube-api-access-r2jwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.259740 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14da68d-4090-4890-ba72-195a943a722b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.260047 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6755e276-5f4a-45db-850e-97ff887e55ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.260061 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04d7066d-b000-4a19-a381-60766be81585-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.260075 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2jwg\" (UniqueName: \"kubernetes.io/projected/04d7066d-b000-4a19-a381-60766be81585-kube-api-access-r2jwg\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.260087 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnh9h\" (UniqueName: \"kubernetes.io/projected/f14da68d-4090-4890-ba72-195a943a722b-kube-api-access-cnh9h\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.260098 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdmtn\" (UniqueName: \"kubernetes.io/projected/6755e276-5f4a-45db-850e-97ff887e55ae-kube-api-access-mdmtn\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.768299 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mqtz6" 
event={"ID":"bc4c925f-3e43-4351-9703-7fdd44a1a9d6","Type":"ContainerStarted","Data":"938cb6b02e3b47a9aba8104a14d29e0e6a3726f4126e5dd07b212afdf79a9deb"} Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.773128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"0b9396c1256ab3d40d2e5d0fdb5bc67c2fb40adf8e8eb86cb7b698fcb40da31a"} Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.773332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"0626d7581e5c6c3198505c4065d89fc9c410aba10dae67b0b664169a2953bc17"} Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.776901 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerStarted","Data":"732444a1f06c4e3a11034ea943f40e2431ac5d9c49afd9d2f7096d0ebaa3f61e"} Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779089 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b0df-account-create-update-2r2xz" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779105 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-da5b-account-create-update-t7nwf" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rrksn" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779126 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sb8pv" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5x8d9" event={"ID":"7efa06c0-f363-43b0-ba89-96d41ff9db74","Type":"ContainerStarted","Data":"c9f0870267e670b6ce6bdd76570bef0e437e36b8cc0b28bec08ec5e5531395a3"} Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779245 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-65b5-account-create-update-d89tz" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5wbx7" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779098 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-644b-account-create-update-slgb4" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.779105 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gxvdh" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.795556 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mqtz6" podStartSLOduration=6.177335242 podStartE2EDuration="41.795532819s" podCreationTimestamp="2026-03-18 15:56:59 +0000 UTC" firstStartedPulling="2026-03-18 15:57:04.068941474 +0000 UTC m=+1372.938270411" lastFinishedPulling="2026-03-18 15:57:39.687139051 +0000 UTC m=+1408.556467988" observedRunningTime="2026-03-18 15:57:40.794136624 +0000 UTC m=+1409.663465581" watchObservedRunningTime="2026-03-18 15:57:40.795532819 +0000 UTC m=+1409.664861756" Mar 18 15:57:40 crc kubenswrapper[4792]: I0318 15:57:40.825903 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5x8d9" podStartSLOduration=3.624491136 podStartE2EDuration="11.825881687s" podCreationTimestamp="2026-03-18 15:57:29 +0000 UTC" firstStartedPulling="2026-03-18 15:57:31.371934318 +0000 UTC m=+1400.241263255" lastFinishedPulling="2026-03-18 15:57:39.573324869 +0000 UTC m=+1408.442653806" observedRunningTime="2026-03-18 15:57:40.812346775 +0000 UTC m=+1409.681675712" watchObservedRunningTime="2026-03-18 15:57:40.825881687 +0000 UTC m=+1409.695210634" Mar 18 15:57:41 crc kubenswrapper[4792]: I0318 15:57:41.793318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"33f3ecccf49968cf35e3e83e383cd8019d04f214f63a9d3dc5d1e050fe387e20"} Mar 18 15:57:41 crc kubenswrapper[4792]: I0318 15:57:41.793631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"0413ce5a49f5e5471c96ecdb287e6902339831dcdd6268317e2002907dfa6eac"} Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.021758 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gxvdh"] Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.031628 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gxvdh"] Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.815313 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"1bd1539da978e13319bfb4282c45ac7ceb72914bee1fcaaeee77d8bf2974ad74"} Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.824869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerStarted","Data":"893413061c267926499bfbb2244ecb9e93e0336208f0f83770b8b32263660cae"} Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.825179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb295773-c070-4d90-b351-cac7e8fa1017","Type":"ContainerStarted","Data":"6917e4db270b48ceba23a6947ace876efc6bcc6904cf24308b5d2762d07be2d6"} Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.853484 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.853467291 podStartE2EDuration="16.853467291s" podCreationTimestamp="2026-03-18 15:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:43.849948269 +0000 UTC m=+1412.719277216" watchObservedRunningTime="2026-03-18 15:57:43.853467291 +0000 UTC m=+1412.722796228" Mar 18 15:57:43 crc kubenswrapper[4792]: I0318 15:57:43.872014 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d7066d-b000-4a19-a381-60766be81585" 
path="/var/lib/kubelet/pods/04d7066d-b000-4a19-a381-60766be81585/volumes" Mar 18 15:57:44 crc kubenswrapper[4792]: I0318 15:57:44.851776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"f02163c0e50b8e313c5795cbd105bb0ee20374b57f00d9ecf8bc8f140114e912"} Mar 18 15:57:44 crc kubenswrapper[4792]: I0318 15:57:44.852677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"e03c68a12b24ca94cfa59fa5c26869d58b2f01dd88fc3b58216db34b9c788b97"} Mar 18 15:57:44 crc kubenswrapper[4792]: I0318 15:57:44.852727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"9986ab7b69957423c509ae524ac5cff7dee637b8be0eb5d39eb6f77b66eaac04"} Mar 18 15:57:44 crc kubenswrapper[4792]: I0318 15:57:44.852743 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"e359c954c1099fdf950bba634a7f7fbaf143e92a4ec75cbf6fb1c2fcc54763f8"} Mar 18 15:57:45 crc kubenswrapper[4792]: I0318 15:57:45.870546 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"f90b3c2fb6a28f43aeb26218003463e168cfd0b00eae92eb632a8f4b1abe74ad"} Mar 18 15:57:45 crc kubenswrapper[4792]: I0318 15:57:45.870867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3c345496-7b4e-41f0-a5ae-4c503e452221","Type":"ContainerStarted","Data":"bd6252269df1057e7e4d671af82e70f79883f3c6d88583dab4173fe28d80e37d"} Mar 18 15:57:45 crc kubenswrapper[4792]: I0318 15:57:45.928450 4792 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.262170709 podStartE2EDuration="53.92842735s" podCreationTimestamp="2026-03-18 15:56:52 +0000 UTC" firstStartedPulling="2026-03-18 15:57:26.838875602 +0000 UTC m=+1395.708204539" lastFinishedPulling="2026-03-18 15:57:43.505132243 +0000 UTC m=+1412.374461180" observedRunningTime="2026-03-18 15:57:45.918489992 +0000 UTC m=+1414.787818939" watchObservedRunningTime="2026-03-18 15:57:45.92842735 +0000 UTC m=+1414.797756287" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.317193 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318002 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46fc90c-8795-46d0-b6b4-e386c126ff37" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318024 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46fc90c-8795-46d0-b6b4-e386c126ff37" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318041 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14da68d-4090-4890-ba72-195a943a722b" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318048 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14da68d-4090-4890-ba72-195a943a722b" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318061 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318068 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318084 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3282b7a4-a673-4f26-9395-3fbcfe76fea4" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318090 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3282b7a4-a673-4f26-9395-3fbcfe76fea4" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318100 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d7066d-b000-4a19-a381-60766be81585" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318106 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d7066d-b000-4a19-a381-60766be81585" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318127 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddad064f-85b2-4334-9b1f-af2e8037a328" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318133 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddad064f-85b2-4334-9b1f-af2e8037a328" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318146 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6755e276-5f4a-45db-850e-97ff887e55ae" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318152 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6755e276-5f4a-45db-850e-97ff887e55ae" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318162 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162c4e7b-9b94-4363-aa4e-25cbb6cce669" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318169 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="162c4e7b-9b94-4363-aa4e-25cbb6cce669" 
containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: E0318 15:57:46.318184 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124caafa-8fb5-40be-b0bb-233a7848176f" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318189 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="124caafa-8fb5-40be-b0bb-233a7848176f" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318371 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46fc90c-8795-46d0-b6b4-e386c126ff37" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318381 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="124caafa-8fb5-40be-b0bb-233a7848176f" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318391 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddad064f-85b2-4334-9b1f-af2e8037a328" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318398 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d7066d-b000-4a19-a381-60766be81585" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318408 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318417 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3282b7a4-a673-4f26-9395-3fbcfe76fea4" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318425 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6755e276-5f4a-45db-850e-97ff887e55ae" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: 
I0318 15:57:46.318432 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14da68d-4090-4890-ba72-195a943a722b" containerName="mariadb-account-create-update" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.318458 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="162c4e7b-9b94-4363-aa4e-25cbb6cce669" containerName="mariadb-database-create" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.319708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.322727 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.327531 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.432921 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.433111 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68k65\" (UniqueName: \"kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.433170 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.433190 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.433224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.433490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535291 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68k65\" (UniqueName: \"kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65\") pod 
\"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535332 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.535462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.536311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.536613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.536638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.536702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.537012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.556401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68k65\" (UniqueName: \"kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65\") pod \"dnsmasq-dns-5c79d794d7-jb5s7\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc 
kubenswrapper[4792]: I0318 15:57:46.638157 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.883022 4792 generic.go:334] "Generic (PLEG): container finished" podID="7efa06c0-f363-43b0-ba89-96d41ff9db74" containerID="c9f0870267e670b6ce6bdd76570bef0e437e36b8cc0b28bec08ec5e5531395a3" exitCode=0 Mar 18 15:57:46 crc kubenswrapper[4792]: I0318 15:57:46.883400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5x8d9" event={"ID":"7efa06c0-f363-43b0-ba89-96d41ff9db74","Type":"ContainerDied","Data":"c9f0870267e670b6ce6bdd76570bef0e437e36b8cc0b28bec08ec5e5531395a3"} Mar 18 15:57:47 crc kubenswrapper[4792]: W0318 15:57:47.122312 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9695440d_ffce_4276_ac24_83557d237c18.slice/crio-abc0ca3840b99eac95df52fcf43956add32b54185b98976680d13ea033227d1f WatchSource:0}: Error finding container abc0ca3840b99eac95df52fcf43956add32b54185b98976680d13ea033227d1f: Status 404 returned error can't find the container with id abc0ca3840b99eac95df52fcf43956add32b54185b98976680d13ea033227d1f Mar 18 15:57:47 crc kubenswrapper[4792]: I0318 15:57:47.131641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:47 crc kubenswrapper[4792]: I0318 15:57:47.896738 4792 generic.go:334] "Generic (PLEG): container finished" podID="9695440d-ffce-4276-ac24-83557d237c18" containerID="aae09ff38531e8da73a2e2ae05e95bb479d4e2180a3c41758911ba4493b8b76d" exitCode=0 Mar 18 15:57:47 crc kubenswrapper[4792]: I0318 15:57:47.896833 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" event={"ID":"9695440d-ffce-4276-ac24-83557d237c18","Type":"ContainerDied","Data":"aae09ff38531e8da73a2e2ae05e95bb479d4e2180a3c41758911ba4493b8b76d"} 
Mar 18 15:57:47 crc kubenswrapper[4792]: I0318 15:57:47.897121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" event={"ID":"9695440d-ffce-4276-ac24-83557d237c18","Type":"ContainerStarted","Data":"abc0ca3840b99eac95df52fcf43956add32b54185b98976680d13ea033227d1f"} Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.080648 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9zvzr"] Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.082938 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.085385 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.092833 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9zvzr"] Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.181362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.181447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27mwq\" (UniqueName: \"kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.208125 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.283389 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.283543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27mwq\" (UniqueName: \"kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.284595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.305811 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27mwq\" (UniqueName: \"kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq\") pod \"root-account-create-update-9zvzr\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.311785 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.385444 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data\") pod \"7efa06c0-f363-43b0-ba89-96d41ff9db74\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.385502 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle\") pod \"7efa06c0-f363-43b0-ba89-96d41ff9db74\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.385625 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np4r5\" (UniqueName: \"kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5\") pod \"7efa06c0-f363-43b0-ba89-96d41ff9db74\" (UID: \"7efa06c0-f363-43b0-ba89-96d41ff9db74\") " Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.396768 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5" (OuterVolumeSpecName: "kube-api-access-np4r5") pod "7efa06c0-f363-43b0-ba89-96d41ff9db74" (UID: "7efa06c0-f363-43b0-ba89-96d41ff9db74"). InnerVolumeSpecName "kube-api-access-np4r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.412467 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7efa06c0-f363-43b0-ba89-96d41ff9db74" (UID: "7efa06c0-f363-43b0-ba89-96d41ff9db74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.428576 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.449855 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data" (OuterVolumeSpecName: "config-data") pod "7efa06c0-f363-43b0-ba89-96d41ff9db74" (UID: "7efa06c0-f363-43b0-ba89-96d41ff9db74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.488540 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.488594 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efa06c0-f363-43b0-ba89-96d41ff9db74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.488612 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np4r5\" (UniqueName: \"kubernetes.io/projected/7efa06c0-f363-43b0-ba89-96d41ff9db74-kube-api-access-np4r5\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.908367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5x8d9" event={"ID":"7efa06c0-f363-43b0-ba89-96d41ff9db74","Type":"ContainerDied","Data":"e7fe498e82aaff81599837bc25c7937f5be2db4cf341b727d9c1592082e95af6"} Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.908662 4792 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e7fe498e82aaff81599837bc25c7937f5be2db4cf341b727d9c1592082e95af6" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.908716 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5x8d9" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.915217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" event={"ID":"9695440d-ffce-4276-ac24-83557d237c18","Type":"ContainerStarted","Data":"6c49758f4c1b773f01aeb16cffd4ddf51ec60072dc4d10385330fb314803e60d"} Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.916276 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.947416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9zvzr"] Mar 18 15:57:48 crc kubenswrapper[4792]: I0318 15:57:48.953095 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" podStartSLOduration=2.95307943 podStartE2EDuration="2.95307943s" podCreationTimestamp="2026-03-18 15:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:48.948343439 +0000 UTC m=+1417.817672376" watchObservedRunningTime="2026-03-18 15:57:48.95307943 +0000 UTC m=+1417.822408367" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.172864 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.196081 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-79hcn"] Mar 18 15:57:49 crc kubenswrapper[4792]: E0318 15:57:49.196599 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efa06c0-f363-43b0-ba89-96d41ff9db74" 
containerName="keystone-db-sync" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.196617 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efa06c0-f363-43b0-ba89-96d41ff9db74" containerName="keystone-db-sync" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.196893 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7efa06c0-f363-43b0-ba89-96d41ff9db74" containerName="keystone-db-sync" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.197934 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.216701 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.216771 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.217006 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.217226 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.217439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-b84g6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.253780 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-79hcn"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.288166 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.290546 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.339288 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.339780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.339826 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.339923 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.340249 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " 
pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.340284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mh7r\" (UniqueName: \"kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.376782 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-pfgr6"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.405438 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.410141 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mncxd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.411024 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446334 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446458 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mh7r\" (UniqueName: \"kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjmh\" (UniqueName: \"kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446701 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446723 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.446852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.472122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.513331 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.515566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.518517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.537524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.552150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.558012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mh7r\" (UniqueName: \"kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560345 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqdmz\" (UniqueName: \"kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560519 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.560851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjmh\" (UniqueName: \"kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.561908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.569023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.569696 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.574611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.621882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys\") pod \"keystone-bootstrap-79hcn\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.640014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjmh\" (UniqueName: \"kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh\") pod \"dnsmasq-dns-5b868669f-wmlqd\" (UID: 
\"e56d720e-4b84-4766-beec-f450aba33d28\") " pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.657388 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pfgr6"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.663946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.664214 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqdmz\" (UniqueName: \"kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.664395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.669938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.676378 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data\") pod \"heat-db-sync-pfgr6\" (UID: 
\"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.710371 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.753528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqdmz\" (UniqueName: \"kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz\") pod \"heat-db-sync-pfgr6\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.814952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tjtqm"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.816860 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.817042 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.821864 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tjtqm"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.825188 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.826072 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dg784" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.826199 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.827027 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-pfgr6" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.850096 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ncpfm"] Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.851909 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.871167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ddm\" (UniqueName: \"kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.871358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.871386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.885668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fr4wl" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.885891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.886015 4792 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.975708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976535 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976588 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lbm4\" (UniqueName: \"kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ddm\" (UniqueName: \"kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " 
pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:49 crc kubenswrapper[4792]: I0318 15:57:49.976998 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.018751 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k2l2s"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.021845 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ncpfm"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.022140 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.022175 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k2l2s"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.028065 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.029937 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qfds7" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.033679 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.050025 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.085174 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.087224 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.094392 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.095039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmkj\" (UniqueName: \"kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.095642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.095807 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lbm4\" (UniqueName: \"kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.096065 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 
15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.096250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.096499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.096729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.097102 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.102306 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.103057 4792 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.107523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9zvzr" event={"ID":"84be4145-eccb-4329-a407-0ecb688a6b20","Type":"ContainerStarted","Data":"2fa8370ff2792528e01c0f49ffb6411dafef9e5ac1bf12e7f35dab1dd3e59640"} Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.107590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9zvzr" event={"ID":"84be4145-eccb-4329-a407-0ecb688a6b20","Type":"ContainerStarted","Data":"05a0959eb5176e7c63585d2b1e47c46d9109e4f969e1dbbc73e0587709afa334"} Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.131628 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jjwfp"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.133521 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.137015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.140843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.141374 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zwhbp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.141647 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.146877 4792 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jjwfp"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.152797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ddm\" (UniqueName: \"kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm\") pod \"neutron-db-sync-tjtqm\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmkj\" (UniqueName: \"kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202707 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " 
pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202741 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202777 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202814 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202861 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202897 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sk9q\" (UniqueName: \"kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q\") pod \"placement-db-sync-jjwfp\" (UID: 
\"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.202990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268r8\" (UniqueName: \"kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.203026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.203072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.203140 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: 
\"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.210768 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9zvzr" podStartSLOduration=2.210741162 podStartE2EDuration="2.210741162s" podCreationTimestamp="2026-03-18 15:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:50.140883453 +0000 UTC m=+1419.010212390" watchObservedRunningTime="2026-03-18 15:57:50.210741162 +0000 UTC m=+1419.080070099" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.234296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmkj\" (UniqueName: \"kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.236502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.237857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.250685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.251043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.251108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data\") pod \"barbican-db-sync-k2l2s\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.252309 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lbm4\" (UniqueName: \"kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.265836 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data\") pod \"cinder-db-sync-ncpfm\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.284613 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.318548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319128 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sk9q\" (UniqueName: \"kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319594 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268r8\" (UniqueName: \"kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.319821 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.320035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.320136 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.320250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.321416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.321620 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.320170 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.322695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.323770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.328998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.340917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.342330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.348362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.348894 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.368288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sk9q\" (UniqueName: \"kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q\") pod \"placement-db-sync-jjwfp\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.369870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268r8\" (UniqueName: \"kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8\") pod \"dnsmasq-dns-cf78879c9-dhpb9\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.461641 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.488170 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jjwfp" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.551236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.649673 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.652586 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.657057 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.657224 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.707463 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csr85\" (UniqueName: \"kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732355 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732446 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.732539 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csr85\" (UniqueName: \"kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834746 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834819 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.834943 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.835015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.839428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " 
pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.840015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.856426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.871000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.872911 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.877236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:50 crc kubenswrapper[4792]: I0318 15:57:50.881668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csr85\" (UniqueName: 
\"kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85\") pod \"ceilometer-0\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " pod="openstack/ceilometer-0" Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.028998 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.068329 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-79hcn"] Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.155035 4792 generic.go:334] "Generic (PLEG): container finished" podID="84be4145-eccb-4329-a407-0ecb688a6b20" containerID="2fa8370ff2792528e01c0f49ffb6411dafef9e5ac1bf12e7f35dab1dd3e59640" exitCode=0 Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.155273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9zvzr" event={"ID":"84be4145-eccb-4329-a407-0ecb688a6b20","Type":"ContainerDied","Data":"2fa8370ff2792528e01c0f49ffb6411dafef9e5ac1bf12e7f35dab1dd3e59640"} Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.156855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79hcn" event={"ID":"a1876da4-5117-4c39-ab2d-51f877341c43","Type":"ContainerStarted","Data":"0fabac6215278783fdd76b159edb98d6ca7e8e1543b6c43df5924c3e9135ba58"} Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.157611 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.189993 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="dnsmasq-dns" containerID="cri-o://6c49758f4c1b773f01aeb16cffd4ddf51ec60072dc4d10385330fb314803e60d" gracePeriod=10 Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.190469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" event={"ID":"e56d720e-4b84-4766-beec-f450aba33d28","Type":"ContainerStarted","Data":"15cdc7a615feab424a530e40c92a31e1816edf0e6072f5350110e627d8cf5722"} Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.393850 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pfgr6"] Mar 18 15:57:51 crc kubenswrapper[4792]: W0318 15:57:51.417789 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod379ff25e_6c0a_45d4_a478_87a5e136aa47.slice/crio-b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d WatchSource:0}: Error finding container b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d: Status 404 returned error can't find the container with id b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.910282 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k2l2s"] Mar 18 15:57:51 crc kubenswrapper[4792]: I0318 15:57:51.926120 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tjtqm"] Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.232617 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jjwfp"] Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.281777 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-db-sync-k2l2s" event={"ID":"2f4cf6d4-998f-445b-82ed-25b2b4670875","Type":"ContainerStarted","Data":"82dd6283f71e0d4188a5e633e1108f38df1dfe9f745fd16552c9f21e82206f4f"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.311278 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ncpfm"] Mar 18 15:57:52 crc kubenswrapper[4792]: W0318 15:57:52.312438 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6967171c_e427_4723_ae1e_25e3bad61d59.slice/crio-c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c WatchSource:0}: Error finding container c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c: Status 404 returned error can't find the container with id c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.326142 4792 generic.go:334] "Generic (PLEG): container finished" podID="e56d720e-4b84-4766-beec-f450aba33d28" containerID="5fa7a1547603e7ed2d2ce6d76a593f2aa91a220ad620908e5a2956fe8c5f5cf3" exitCode=0 Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.326239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" event={"ID":"e56d720e-4b84-4766-beec-f450aba33d28","Type":"ContainerDied","Data":"5fa7a1547603e7ed2d2ce6d76a593f2aa91a220ad620908e5a2956fe8c5f5cf3"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.334036 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.360367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pfgr6" event={"ID":"379ff25e-6c0a-45d4-a478-87a5e136aa47","Type":"ContainerStarted","Data":"b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 
15:57:52.378924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tjtqm" event={"ID":"e5416678-c001-4966-a019-56f29f29adc7","Type":"ContainerStarted","Data":"17283b7189e102f3b6d4e1d0c38f8efbc1a3262f99564a708cfa2405f0063345"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.382002 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.388403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79hcn" event={"ID":"a1876da4-5117-4c39-ab2d-51f877341c43","Type":"ContainerStarted","Data":"88cf61d13abf105698b1cc60922b522f27c36a821f7ee2adcc8b4002d05a4c71"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.409628 4792 generic.go:334] "Generic (PLEG): container finished" podID="9695440d-ffce-4276-ac24-83557d237c18" containerID="6c49758f4c1b773f01aeb16cffd4ddf51ec60072dc4d10385330fb314803e60d" exitCode=0 Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.409894 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.410134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jb5s7" event={"ID":"9695440d-ffce-4276-ac24-83557d237c18","Type":"ContainerDied","Data":"6c49758f4c1b773f01aeb16cffd4ddf51ec60072dc4d10385330fb314803e60d"} Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.410257 4792 scope.go:117] "RemoveContainer" containerID="6c49758f4c1b773f01aeb16cffd4ddf51ec60072dc4d10385330fb314803e60d" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.422727 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.465167 4792 scope.go:117] "RemoveContainer" containerID="aae09ff38531e8da73a2e2ae05e95bb479d4e2180a3c41758911ba4493b8b76d" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.490372 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tjtqm" podStartSLOduration=3.490357853 podStartE2EDuration="3.490357853s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:52.40786356 +0000 UTC m=+1421.277192507" watchObservedRunningTime="2026-03-18 15:57:52.490357853 +0000 UTC m=+1421.359686790" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510473 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510538 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68k65\" (UniqueName: \"kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.510847 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb\") pod \"9695440d-ffce-4276-ac24-83557d237c18\" (UID: \"9695440d-ffce-4276-ac24-83557d237c18\") " Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.531545 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-79hcn" podStartSLOduration=3.5315248070000003 podStartE2EDuration="3.531524807s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-18 15:57:52.426787483 +0000 UTC m=+1421.296116420" watchObservedRunningTime="2026-03-18 15:57:52.531524807 +0000 UTC m=+1421.400853744" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.540618 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65" (OuterVolumeSpecName: "kube-api-access-68k65") pod "9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "kube-api-access-68k65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.617909 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68k65\" (UniqueName: \"kubernetes.io/projected/9695440d-ffce-4276-ac24-83557d237c18-kube-api-access-68k65\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.902143 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.940178 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.961677 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:52 crc kubenswrapper[4792]: I0318 15:57:52.998269 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.014214 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config" (OuterVolumeSpecName: "config") pod "9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.045587 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.045630 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.045641 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.210447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"9695440d-ffce-4276-ac24-83557d237c18" (UID: "9695440d-ffce-4276-ac24-83557d237c18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.316292 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9695440d-ffce-4276-ac24-83557d237c18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.400175 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.428962 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jb5s7"] Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.440411 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.460943 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff35f858-03ee-44eb-8333-2f426efa6281" containerID="4ec3145a609be8415bcaa05415461b41b58066f8b531dd9dbd4c32471649b088" exitCode=0 Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.461027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" event={"ID":"ff35f858-03ee-44eb-8333-2f426efa6281","Type":"ContainerDied","Data":"4ec3145a609be8415bcaa05415461b41b58066f8b531dd9dbd4c32471649b088"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.461054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" event={"ID":"ff35f858-03ee-44eb-8333-2f426efa6281","Type":"ContainerStarted","Data":"54e965186b30467eda031c1a615ed2387546ed0a0565a4ec76d4254d4e09e695"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.483933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ncpfm" 
event={"ID":"6967171c-e427-4723-ae1e-25e3bad61d59","Type":"ContainerStarted","Data":"c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.484186 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.522499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tjtqm" event={"ID":"e5416678-c001-4966-a019-56f29f29adc7","Type":"ContainerStarted","Data":"1babca15697a2cd9b118a6bc1af2ffccc9c677253ff1134da2c62b4b40f3eea3"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.522526 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjmh\" (UniqueName: \"kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.523236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.525156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.525486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" 
(UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.525712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.525739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb\") pod \"e56d720e-4b84-4766-beec-f450aba33d28\" (UID: \"e56d720e-4b84-4766-beec-f450aba33d28\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.530602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh" (OuterVolumeSpecName: "kube-api-access-ktjmh") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "kube-api-access-ktjmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.552039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jjwfp" event={"ID":"6a148250-156e-4b40-969e-3569cea8a403","Type":"ContainerStarted","Data":"3591a0ece444e8fb03717aad4a7d5e02539ada35b8350586d7378bc10ed3b98b"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.556002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerStarted","Data":"2bbf0f2ed1e9202e0a37c40e5ad513d0305fabbcc3f333d50796855e3fa1439b"} Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.567630 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.583656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.584856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.610685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.627247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config" (OuterVolumeSpecName: "config") pod "e56d720e-4b84-4766-beec-f450aba33d28" (UID: "e56d720e-4b84-4766-beec-f450aba33d28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629288 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629327 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629341 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629353 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjmh\" (UniqueName: \"kubernetes.io/projected/e56d720e-4b84-4766-beec-f450aba33d28-kube-api-access-ktjmh\") on node \"crc\" 
DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629365 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.629379 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56d720e-4b84-4766-beec-f450aba33d28-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.741524 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.833816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts\") pod \"84be4145-eccb-4329-a407-0ecb688a6b20\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.833927 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27mwq\" (UniqueName: \"kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq\") pod \"84be4145-eccb-4329-a407-0ecb688a6b20\" (UID: \"84be4145-eccb-4329-a407-0ecb688a6b20\") " Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.838455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84be4145-eccb-4329-a407-0ecb688a6b20" (UID: "84be4145-eccb-4329-a407-0ecb688a6b20"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.870180 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq" (OuterVolumeSpecName: "kube-api-access-27mwq") pod "84be4145-eccb-4329-a407-0ecb688a6b20" (UID: "84be4145-eccb-4329-a407-0ecb688a6b20"). InnerVolumeSpecName "kube-api-access-27mwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.913638 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9695440d-ffce-4276-ac24-83557d237c18" path="/var/lib/kubelet/pods/9695440d-ffce-4276-ac24-83557d237c18/volumes" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.937255 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84be4145-eccb-4329-a407-0ecb688a6b20-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:53 crc kubenswrapper[4792]: I0318 15:57:53.937296 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27mwq\" (UniqueName: \"kubernetes.io/projected/84be4145-eccb-4329-a407-0ecb688a6b20-kube-api-access-27mwq\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.587404 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9zvzr" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.587424 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9zvzr" event={"ID":"84be4145-eccb-4329-a407-0ecb688a6b20","Type":"ContainerDied","Data":"05a0959eb5176e7c63585d2b1e47c46d9109e4f969e1dbbc73e0587709afa334"} Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.587942 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a0959eb5176e7c63585d2b1e47c46d9109e4f969e1dbbc73e0587709afa334" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.592162 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" event={"ID":"ff35f858-03ee-44eb-8333-2f426efa6281","Type":"ContainerStarted","Data":"d69bf31ecbcf4ba50c202b35804067622a79483ed926ca7b8f789d47a973bb48"} Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.592702 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.594753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" event={"ID":"e56d720e-4b84-4766-beec-f450aba33d28","Type":"ContainerDied","Data":"15cdc7a615feab424a530e40c92a31e1816edf0e6072f5350110e627d8cf5722"} Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.594809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-wmlqd" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.594828 4792 scope.go:117] "RemoveContainer" containerID="5fa7a1547603e7ed2d2ce6d76a593f2aa91a220ad620908e5a2956fe8c5f5cf3" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.642877 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" podStartSLOduration=5.642851717 podStartE2EDuration="5.642851717s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:57:54.616312447 +0000 UTC m=+1423.485641384" watchObservedRunningTime="2026-03-18 15:57:54.642851717 +0000 UTC m=+1423.512180644" Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.717754 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:54 crc kubenswrapper[4792]: I0318 15:57:54.758857 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-wmlqd"] Mar 18 15:57:55 crc kubenswrapper[4792]: I0318 15:57:55.879083 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56d720e-4b84-4766-beec-f450aba33d28" path="/var/lib/kubelet/pods/e56d720e-4b84-4766-beec-f450aba33d28/volumes" Mar 18 15:57:56 crc kubenswrapper[4792]: I0318 15:57:56.627450 4792 generic.go:334] "Generic (PLEG): container finished" podID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" containerID="938cb6b02e3b47a9aba8104a14d29e0e6a3726f4126e5dd07b212afdf79a9deb" exitCode=0 Mar 18 15:57:56 crc kubenswrapper[4792]: I0318 15:57:56.627757 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mqtz6" event={"ID":"bc4c925f-3e43-4351-9703-7fdd44a1a9d6","Type":"ContainerDied","Data":"938cb6b02e3b47a9aba8104a14d29e0e6a3726f4126e5dd07b212afdf79a9deb"} Mar 18 15:57:56 crc kubenswrapper[4792]: I0318 
15:57:56.636063 4792 generic.go:334] "Generic (PLEG): container finished" podID="a1876da4-5117-4c39-ab2d-51f877341c43" containerID="88cf61d13abf105698b1cc60922b522f27c36a821f7ee2adcc8b4002d05a4c71" exitCode=0 Mar 18 15:57:56 crc kubenswrapper[4792]: I0318 15:57:56.636106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79hcn" event={"ID":"a1876da4-5117-4c39-ab2d-51f877341c43","Type":"ContainerDied","Data":"88cf61d13abf105698b1cc60922b522f27c36a821f7ee2adcc8b4002d05a4c71"} Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.208262 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.218185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.498149 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mh7r\" (UniqueName: \"kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569085 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569135 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.569522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys\") pod \"a1876da4-5117-4c39-ab2d-51f877341c43\" (UID: \"a1876da4-5117-4c39-ab2d-51f877341c43\") " Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.577835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.577874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r" (OuterVolumeSpecName: "kube-api-access-7mh7r") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "kube-api-access-7mh7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.578173 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts" (OuterVolumeSpecName: "scripts") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.578835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.608062 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.621760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data" (OuterVolumeSpecName: "config-data") pod "a1876da4-5117-4c39-ab2d-51f877341c43" (UID: "a1876da4-5117-4c39-ab2d-51f877341c43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.671241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79hcn" event={"ID":"a1876da4-5117-4c39-ab2d-51f877341c43","Type":"ContainerDied","Data":"0fabac6215278783fdd76b159edb98d6ca7e8e1543b6c43df5924c3e9135ba58"} Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.671306 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fabac6215278783fdd76b159edb98d6ca7e8e1543b6c43df5924c3e9135ba58" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674173 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674210 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674223 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674235 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674253 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mh7r\" (UniqueName: \"kubernetes.io/projected/a1876da4-5117-4c39-ab2d-51f877341c43-kube-api-access-7mh7r\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674264 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1876da4-5117-4c39-ab2d-51f877341c43-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.674787 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79hcn" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.683649 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.770908 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-79hcn"] Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.786604 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-79hcn"] Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.856998 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nzh9d"] Mar 18 15:57:58 crc kubenswrapper[4792]: E0318 15:57:58.857609 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84be4145-eccb-4329-a407-0ecb688a6b20" containerName="mariadb-account-create-update" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.857633 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="84be4145-eccb-4329-a407-0ecb688a6b20" containerName="mariadb-account-create-update" Mar 18 15:57:58 crc kubenswrapper[4792]: E0318 15:57:58.857650 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="init" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.857660 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="init" Mar 18 15:57:58 crc kubenswrapper[4792]: E0318 15:57:58.857692 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56d720e-4b84-4766-beec-f450aba33d28" containerName="init" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.857701 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56d720e-4b84-4766-beec-f450aba33d28" containerName="init" Mar 18 15:57:58 crc kubenswrapper[4792]: E0318 15:57:58.857718 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="dnsmasq-dns" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.857726 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="dnsmasq-dns" Mar 18 15:57:58 crc kubenswrapper[4792]: E0318 15:57:58.857755 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1876da4-5117-4c39-ab2d-51f877341c43" containerName="keystone-bootstrap" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.857764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1876da4-5117-4c39-ab2d-51f877341c43" containerName="keystone-bootstrap" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.858038 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="84be4145-eccb-4329-a407-0ecb688a6b20" containerName="mariadb-account-create-update" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.858060 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56d720e-4b84-4766-beec-f450aba33d28" containerName="init" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.858076 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9695440d-ffce-4276-ac24-83557d237c18" containerName="dnsmasq-dns" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.858100 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1876da4-5117-4c39-ab2d-51f877341c43" containerName="keystone-bootstrap" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.859108 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.861875 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.862569 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.862772 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.864037 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-b84g6" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.864409 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.867549 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nzh9d"] Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbfg\" (UniqueName: \"kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881691 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.881805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984101 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbfg\" (UniqueName: \"kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.984269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data\") pod 
\"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.988770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.988944 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.989008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:58 crc kubenswrapper[4792]: I0318 15:57:58.992991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:59 crc kubenswrapper[4792]: I0318 15:57:59.002614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:59 crc kubenswrapper[4792]: 
I0318 15:57:59.007917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbfg\" (UniqueName: \"kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg\") pod \"keystone-bootstrap-nzh9d\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") " pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:59 crc kubenswrapper[4792]: I0318 15:57:59.194155 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:57:59 crc kubenswrapper[4792]: I0318 15:57:59.870690 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1876da4-5117-4c39-ab2d-51f877341c43" path="/var/lib/kubelet/pods/a1876da4-5117-4c39-ab2d-51f877341c43/volumes" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.143778 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564158-5fcfc"] Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.145853 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.149802 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.150210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.150520 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.167574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-5fcfc"] Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.223811 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcchl\" (UniqueName: \"kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl\") pod \"auto-csr-approver-29564158-5fcfc\" (UID: \"d587f36a-ade4-499c-b160-673d58efb861\") " pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.327076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcchl\" (UniqueName: \"kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl\") pod \"auto-csr-approver-29564158-5fcfc\" (UID: \"d587f36a-ade4-499c-b160-673d58efb861\") " pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.349256 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcchl\" (UniqueName: \"kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl\") pod \"auto-csr-approver-29564158-5fcfc\" (UID: \"d587f36a-ade4-499c-b160-673d58efb861\") " 
pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.464247 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.477532 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.534112 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.534359 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" containerID="cri-o://86f3d861e016ac7a7173ab172f7cd23a4c1117667a6d366e1f4e198fa206b9fc" gracePeriod=10 Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.699098 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerID="86f3d861e016ac7a7173ab172f7cd23a4c1117667a6d366e1f4e198fa206b9fc" exitCode=0 Mar 18 15:58:00 crc kubenswrapper[4792]: I0318 15:58:00.699158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" event={"ID":"fa8c1204-320e-41c5-8393-c13f50febe7e","Type":"ContainerDied","Data":"86f3d861e016ac7a7173ab172f7cd23a4c1117667a6d366e1f4e198fa206b9fc"} Mar 18 15:58:02 crc kubenswrapper[4792]: I0318 15:58:02.397098 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: connect: connection refused" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.396621 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: connect: connection refused" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.527962 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mqtz6" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.611953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle\") pod \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.612070 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnkc6\" (UniqueName: \"kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6\") pod \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.612127 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data\") pod \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.612244 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data\") pod \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\" (UID: \"bc4c925f-3e43-4351-9703-7fdd44a1a9d6\") " Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.620034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bc4c925f-3e43-4351-9703-7fdd44a1a9d6" (UID: "bc4c925f-3e43-4351-9703-7fdd44a1a9d6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.620194 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6" (OuterVolumeSpecName: "kube-api-access-gnkc6") pod "bc4c925f-3e43-4351-9703-7fdd44a1a9d6" (UID: "bc4c925f-3e43-4351-9703-7fdd44a1a9d6"). InnerVolumeSpecName "kube-api-access-gnkc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.654419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc4c925f-3e43-4351-9703-7fdd44a1a9d6" (UID: "bc4c925f-3e43-4351-9703-7fdd44a1a9d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.706004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data" (OuterVolumeSpecName: "config-data") pod "bc4c925f-3e43-4351-9703-7fdd44a1a9d6" (UID: "bc4c925f-3e43-4351-9703-7fdd44a1a9d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.715357 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnkc6\" (UniqueName: \"kubernetes.io/projected/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-kube-api-access-gnkc6\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.715396 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.715407 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.715415 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4c925f-3e43-4351-9703-7fdd44a1a9d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.783031 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mqtz6" event={"ID":"bc4c925f-3e43-4351-9703-7fdd44a1a9d6","Type":"ContainerDied","Data":"d1ddb043f889189a4edf98c3e6cac2948dec5ae5f90dbd0c9bf52b6192251963"} Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.783556 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ddb043f889189a4edf98c3e6cac2948dec5ae5f90dbd0c9bf52b6192251963" Mar 18 15:58:07 crc kubenswrapper[4792]: I0318 15:58:07.783052 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mqtz6" Mar 18 15:58:07 crc kubenswrapper[4792]: E0318 15:58:07.934333 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 18 15:58:07 crc kubenswrapper[4792]: E0318 15:58:07.934508 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55ch577h7bh6h88hb8h5bdh689h7ch5ch579h54dh5cch659h567h96h68fh54fh56dh5d8h98h5ch5c4h55ch576hch5bch585h55fh55fh58ch5f6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csr85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pyt
hon3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bd1c490b-f876-4095-9ae8-8280d3ce02c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.035084 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:09 crc kubenswrapper[4792]: E0318 15:58:09.036115 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" containerName="glance-db-sync" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.036135 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" containerName="glance-db-sync" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.036423 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" containerName="glance-db-sync" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.037717 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.084883 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160678 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.160904 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6xc\" (UniqueName: \"kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263223 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263352 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ff6xc\" (UniqueName: \"kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.263591 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.265131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.265134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.265686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.266125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.278265 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.300096 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6xc\" (UniqueName: \"kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc\") pod \"dnsmasq-dns-56df8fb6b7-kr2zd\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:09 crc kubenswrapper[4792]: I0318 15:58:09.371988 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.044029 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.046564 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.049595 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.049958 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sw42f" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.050726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.061697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.110897 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.110956 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.111624 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 
15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.111674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.111730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9446\" (UniqueName: \"kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.111842 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.111889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 
18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213565 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9446\" (UniqueName: \"kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 
15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.213942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.214588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.214907 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.222249 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.222293 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f4135ef5c687380422f9124ccce113815e08bdbecc9d37bdfa336b12e119b7ff/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.222522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.224156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.228249 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.230127 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.234123 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.234722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.254915 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9446\" (UniqueName: \"kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.263600 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 
18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317346 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbssb\" (UniqueName: \"kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.317642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.343600 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.391360 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.418993 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbssb\" (UniqueName: \"kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419156 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419264 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.419744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.420666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.423599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.424185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.424414 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.424727 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.424765 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b0cea5cb737a527758633cd5e46883214d92b47b2dcb8de95269a535706d518/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.435774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.464532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbssb\" (UniqueName: \"kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.490071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:10 crc kubenswrapper[4792]: I0318 15:58:10.698368 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:12 crc kubenswrapper[4792]: I0318 15:58:12.064901 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:12 crc kubenswrapper[4792]: I0318 15:58:12.147375 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:13 crc kubenswrapper[4792]: I0318 15:58:13.868364 4792 generic.go:334] "Generic (PLEG): container finished" podID="e5416678-c001-4966-a019-56f29f29adc7" containerID="1babca15697a2cd9b118a6bc1af2ffccc9c677253ff1134da2c62b4b40f3eea3" exitCode=0 Mar 18 15:58:13 crc kubenswrapper[4792]: I0318 15:58:13.872014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tjtqm" event={"ID":"e5416678-c001-4966-a019-56f29f29adc7","Type":"ContainerDied","Data":"1babca15697a2cd9b118a6bc1af2ffccc9c677253ff1134da2c62b4b40f3eea3"} Mar 18 15:58:17 crc kubenswrapper[4792]: I0318 15:58:17.397294 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Mar 18 15:58:17 crc kubenswrapper[4792]: I0318 15:58:17.398162 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.368666 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.369304 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c 
/usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qqdmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-pfgr6_openstack(379ff25e-6c0a-45d4-a478-87a5e136aa47): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.370526 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-pfgr6" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.780746 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.781003 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hmkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[
],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-k2l2s_openstack(2f4cf6d4-998f-445b-82ed-25b2b4670875): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.782180 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k2l2s" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.912035 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.918656 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.919362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tjtqm" event={"ID":"e5416678-c001-4966-a019-56f29f29adc7","Type":"ContainerDied","Data":"17283b7189e102f3b6d4e1d0c38f8efbc1a3262f99564a708cfa2405f0063345"} Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.919396 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17283b7189e102f3b6d4e1d0c38f8efbc1a3262f99564a708cfa2405f0063345" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.922658 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.922912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" event={"ID":"fa8c1204-320e-41c5-8393-c13f50febe7e","Type":"ContainerDied","Data":"f3ba0cc2b9a066fe7a65566971942c7ef6a34c41be1a946df0ea4310d5c49c11"} Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.922942 4792 scope.go:117] "RemoveContainer" containerID="86f3d861e016ac7a7173ab172f7cd23a4c1117667a6d366e1f4e198fa206b9fc" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.924991 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-k2l2s" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" Mar 18 15:58:18 crc kubenswrapper[4792]: E0318 15:58:18.926238 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" 
pod="openstack/heat-db-sync-pfgr6" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.931353 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb\") pod \"fa8c1204-320e-41c5-8393-c13f50febe7e\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.931917 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config\") pod \"fa8c1204-320e-41c5-8393-c13f50febe7e\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb\") pod \"fa8c1204-320e-41c5-8393-c13f50febe7e\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932095 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4ddm\" (UniqueName: \"kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm\") pod \"e5416678-c001-4966-a019-56f29f29adc7\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc\") pod \"fa8c1204-320e-41c5-8393-c13f50febe7e\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932211 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config\") pod \"e5416678-c001-4966-a019-56f29f29adc7\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmxwm\" (UniqueName: \"kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm\") pod \"fa8c1204-320e-41c5-8393-c13f50febe7e\" (UID: \"fa8c1204-320e-41c5-8393-c13f50febe7e\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.932356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle\") pod \"e5416678-c001-4966-a019-56f29f29adc7\" (UID: \"e5416678-c001-4966-a019-56f29f29adc7\") " Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.945688 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm" (OuterVolumeSpecName: "kube-api-access-tmxwm") pod "fa8c1204-320e-41c5-8393-c13f50febe7e" (UID: "fa8c1204-320e-41c5-8393-c13f50febe7e"). InnerVolumeSpecName "kube-api-access-tmxwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:18 crc kubenswrapper[4792]: I0318 15:58:18.947124 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm" (OuterVolumeSpecName: "kube-api-access-w4ddm") pod "e5416678-c001-4966-a019-56f29f29adc7" (UID: "e5416678-c001-4966-a019-56f29f29adc7"). InnerVolumeSpecName "kube-api-access-w4ddm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.014532 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config" (OuterVolumeSpecName: "config") pod "e5416678-c001-4966-a019-56f29f29adc7" (UID: "e5416678-c001-4966-a019-56f29f29adc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.035745 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmxwm\" (UniqueName: \"kubernetes.io/projected/fa8c1204-320e-41c5-8393-c13f50febe7e-kube-api-access-tmxwm\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.035792 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4ddm\" (UniqueName: \"kubernetes.io/projected/e5416678-c001-4966-a019-56f29f29adc7-kube-api-access-w4ddm\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.035806 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.058762 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5416678-c001-4966-a019-56f29f29adc7" (UID: "e5416678-c001-4966-a019-56f29f29adc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.140588 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5416678-c001-4966-a019-56f29f29adc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.166612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa8c1204-320e-41c5-8393-c13f50febe7e" (UID: "fa8c1204-320e-41c5-8393-c13f50febe7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.167296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa8c1204-320e-41c5-8393-c13f50febe7e" (UID: "fa8c1204-320e-41c5-8393-c13f50febe7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.174646 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa8c1204-320e-41c5-8393-c13f50febe7e" (UID: "fa8c1204-320e-41c5-8393-c13f50febe7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.254132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config" (OuterVolumeSpecName: "config") pod "fa8c1204-320e-41c5-8393-c13f50febe7e" (UID: "fa8c1204-320e-41c5-8393-c13f50febe7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.255002 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.255039 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.255054 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.255066 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa8c1204-320e-41c5-8393-c13f50febe7e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.567731 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.577428 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-fdrj5"] Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.869704 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" path="/var/lib/kubelet/pods/fa8c1204-320e-41c5-8393-c13f50febe7e/volumes" Mar 18 15:58:19 crc kubenswrapper[4792]: I0318 15:58:19.931491 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tjtqm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.219305 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.266526 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.267368 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="init" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.267490 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="init" Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.267576 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5416678-c001-4966-a019-56f29f29adc7" containerName="neutron-db-sync" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.267654 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5416678-c001-4966-a019-56f29f29adc7" containerName="neutron-db-sync" Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.267753 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.267833 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.268196 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.268317 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5416678-c001-4966-a019-56f29f29adc7" containerName="neutron-db-sync" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 
15:58:20.269817 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.280514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.280592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.280650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxtqd\" (UniqueName: \"kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.280715 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.280914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.281002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.286206 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.323505 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.325946 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.334576 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dg784" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.334772 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.334903 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.335068 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.375649 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.383184 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.383731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.383838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") 
" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.383898 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384238 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384668 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxtqd\" (UniqueName: \"kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " 
pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384841 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.384925 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.385227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7h6p\" (UniqueName: \"kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.386339 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.386507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " 
pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.386712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.386888 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.386900 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.423252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxtqd\" (UniqueName: \"kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd\") pod \"dnsmasq-dns-6b7b667979-5lpcm\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.487084 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.487136 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.487188 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.487417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.487505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7h6p\" (UniqueName: \"kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.494893 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.496727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.502278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.508008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.508492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7h6p\" (UniqueName: \"kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p\") pod \"neutron-56b66ddf48-4mkl2\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.613604 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:20 crc kubenswrapper[4792]: I0318 15:58:20.659867 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.935084 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.935601 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tl
s-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lbm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ncpfm_openstack(6967171c-e427-4723-ae1e-25e3bad61d59): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 15:58:20 crc kubenswrapper[4792]: E0318 15:58:20.937010 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ncpfm" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" Mar 18 15:58:21 crc kubenswrapper[4792]: I0318 15:58:21.586768 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-5fcfc"] Mar 18 15:58:22 crc kubenswrapper[4792]: E0318 15:58:22.003616 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ncpfm" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 
15:58:22.278812 4792 scope.go:117] "RemoveContainer" containerID="2bed1913e692f9d8e0ac2042a50544a1e56841327dc84a2607a5697896d758ab" Mar 18 15:58:22 crc kubenswrapper[4792]: W0318 15:58:22.330685 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd587f36a_ade4_499c_b160_673d58efb861.slice/crio-80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31 WatchSource:0}: Error finding container 80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31: Status 404 returned error can't find the container with id 80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31 Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.401477 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-fdrj5" podUID="fa8c1204-320e-41c5-8393-c13f50febe7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.842754 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.861653 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.864908 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.867603 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.880198 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975884 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.975957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:22 crc kubenswrapper[4792]: I0318 15:58:22.976211 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-922sn\" (UniqueName: \"kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.022504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" event={"ID":"d587f36a-ade4-499c-b160-673d58efb861","Type":"ContainerStarted","Data":"80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31"} Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.026425 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jjwfp" 
event={"ID":"6a148250-156e-4b40-969e-3569cea8a403","Type":"ContainerStarted","Data":"59823f71f56a681e7eaa4b6e106c5069d39fdf7e592ab47bc1d91f1ef0778ed3"} Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.032514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerStarted","Data":"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598"} Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.069072 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jjwfp" podStartSLOduration=7.51115839 podStartE2EDuration="34.0690491s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="2026-03-18 15:57:52.242559614 +0000 UTC m=+1421.111888551" lastFinishedPulling="2026-03-18 15:58:18.800450324 +0000 UTC m=+1447.669779261" observedRunningTime="2026-03-18 15:58:23.059637712 +0000 UTC m=+1451.928966649" watchObservedRunningTime="2026-03-18 15:58:23.0690491 +0000 UTC m=+1451.938378037" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-922sn\" (UniqueName: \"kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.080574 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.099474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs\") pod 
\"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.100688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.101254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.101361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.111436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.112119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc 
kubenswrapper[4792]: I0318 15:58:23.112185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-922sn\" (UniqueName: \"kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn\") pod \"neutron-76f8d6944f-p946j\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.121182 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.209391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.474187 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nzh9d"] Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.503767 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.571962 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.596006 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:58:23 crc kubenswrapper[4792]: W0318 15:58:23.601222 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bab4c01_139c_4d38_81c5_d0630b78c37e.slice/crio-9f98b33e67a1109bd3ba21e887867ff134c04a60172b771ad3000f416efa4af5 WatchSource:0}: Error finding container 9f98b33e67a1109bd3ba21e887867ff134c04a60172b771ad3000f416efa4af5: Status 404 returned error can't find the container with id 9f98b33e67a1109bd3ba21e887867ff134c04a60172b771ad3000f416efa4af5 Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.628391 4792 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:23 crc kubenswrapper[4792]: W0318 15:58:23.684615 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod314657e3_e425_4547_9bc0_34b6554bb0c9.slice/crio-1854bed556e91c77e1512750729d15e99aaf606c82010fcca10c11ce1e775e8a WatchSource:0}: Error finding container 1854bed556e91c77e1512750729d15e99aaf606c82010fcca10c11ce1e775e8a: Status 404 returned error can't find the container with id 1854bed556e91c77e1512750729d15e99aaf606c82010fcca10c11ce1e775e8a Mar 18 15:58:23 crc kubenswrapper[4792]: W0318 15:58:23.703632 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc95141f3_8c44_452b_90b6_cd67b5d42268.slice/crio-59c43ef4a71932eda0c59b358ec3b7001ae40efb73058b621c989572a5dd322d WatchSource:0}: Error finding container 59c43ef4a71932eda0c59b358ec3b7001ae40efb73058b621c989572a5dd322d: Status 404 returned error can't find the container with id 59c43ef4a71932eda0c59b358ec3b7001ae40efb73058b621c989572a5dd322d Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.773879 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:58:23 crc kubenswrapper[4792]: W0318 15:58:23.817247 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d8ca8f_4d78_40ba_b62c_7329c59cb7c5.slice/crio-c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912 WatchSource:0}: Error finding container c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912: Status 404 returned error can't find the container with id c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912 Mar 18 15:58:23 crc kubenswrapper[4792]: I0318 15:58:23.998665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.057187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerStarted","Data":"a8c4706e66b76063d854f9aea30a67c0b194925ddcb721ae4a5ebe24d74353ed"} Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.131852 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerStarted","Data":"c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912"} Mar 18 15:58:24 crc kubenswrapper[4792]: W0318 15:58:24.136929 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84428496_0cfa_4794_baaa_65b6350ec310.slice/crio-dc2e566436b429754d43058be1569b413074b3231c8df5633ed98c8a4f08c19e WatchSource:0}: Error finding container dc2e566436b429754d43058be1569b413074b3231c8df5633ed98c8a4f08c19e: Status 404 returned error can't find the container with id dc2e566436b429754d43058be1569b413074b3231c8df5633ed98c8a4f08c19e Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.137807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nzh9d" event={"ID":"46820dcb-64bf-40d0-ba36-827e2937de58","Type":"ContainerStarted","Data":"a8c773b849431af1a623329a2ae174b3ba5bf04b93e36c85c880f70fa947a233"} Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.168989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" event={"ID":"5bab4c01-139c-4d38-81c5-d0630b78c37e","Type":"ContainerStarted","Data":"9f98b33e67a1109bd3ba21e887867ff134c04a60172b771ad3000f416efa4af5"} Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.184360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerStarted","Data":"59c43ef4a71932eda0c59b358ec3b7001ae40efb73058b621c989572a5dd322d"} Mar 18 15:58:24 crc kubenswrapper[4792]: I0318 15:58:24.202247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerStarted","Data":"1854bed556e91c77e1512750729d15e99aaf606c82010fcca10c11ce1e775e8a"} Mar 18 15:58:25 crc kubenswrapper[4792]: I0318 15:58:25.216213 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerStarted","Data":"ebcad65ca1f0af0ffda3ee08892e64aed0d92dc31f70a297586f8e0a8034cd0a"} Mar 18 15:58:25 crc kubenswrapper[4792]: I0318 15:58:25.218797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nzh9d" event={"ID":"46820dcb-64bf-40d0-ba36-827e2937de58","Type":"ContainerStarted","Data":"5524ac9f88d877ff2d5e17e2864d7fa1f95f89b4b6b117790ee5a210a64d8cc5"} Mar 18 15:58:25 crc kubenswrapper[4792]: I0318 15:58:25.221198 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerStarted","Data":"6f5e342bf67017c442a9bf121defb671b61b4e3a7126f9089a7c591ad96caddd"} Mar 18 15:58:25 crc kubenswrapper[4792]: I0318 15:58:25.223255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerStarted","Data":"dc2e566436b429754d43058be1569b413074b3231c8df5633ed98c8a4f08c19e"} Mar 18 15:58:25 crc kubenswrapper[4792]: I0318 15:58:25.249231 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nzh9d" podStartSLOduration=27.249209089 
podStartE2EDuration="27.249209089s" podCreationTimestamp="2026-03-18 15:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:25.235747563 +0000 UTC m=+1454.105076500" watchObservedRunningTime="2026-03-18 15:58:25.249209089 +0000 UTC m=+1454.118538026" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.251280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerStarted","Data":"e20d47afd4e19084054ae7e9e5d525fe420653b2bb7cd732cee5b99c236ea119"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.251858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerStarted","Data":"5d74754be1babe43267f5a09ee47baddd1161593a188bc0c08032c6fe04290c7"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.251926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.254224 4792 generic.go:334] "Generic (PLEG): container finished" podID="d587f36a-ade4-499c-b160-673d58efb861" containerID="3ef6bfddcc01f150b84d703a422b8347269df5148d2c18e20cda9082dac149c5" exitCode=0 Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.254275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" event={"ID":"d587f36a-ade4-499c-b160-673d58efb861","Type":"ContainerDied","Data":"3ef6bfddcc01f150b84d703a422b8347269df5148d2c18e20cda9082dac149c5"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.258111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" 
event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerStarted","Data":"e83dd235cb7655cc32dc1b455f73a87297d7a65837faf67dec7c6873f58184f1"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.258166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerStarted","Data":"a3fc31e388b6c2a17bc5a418f4aee15bc39270924b3f2af99fdb44f8479a617e"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.258296 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.261172 4792 generic.go:334] "Generic (PLEG): container finished" podID="5bab4c01-139c-4d38-81c5-d0630b78c37e" containerID="39269cd75cdbf8bdaddd5c0585441c25e6f70d4203507b9c562dcacc1f6064ca" exitCode=0 Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.261252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" event={"ID":"5bab4c01-139c-4d38-81c5-d0630b78c37e","Type":"ContainerDied","Data":"39269cd75cdbf8bdaddd5c0585441c25e6f70d4203507b9c562dcacc1f6064ca"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.297369 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76f8d6944f-p946j" podStartSLOduration=4.297345822 podStartE2EDuration="4.297345822s" podCreationTimestamp="2026-03-18 15:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:26.277545615 +0000 UTC m=+1455.146874552" watchObservedRunningTime="2026-03-18 15:58:26.297345822 +0000 UTC m=+1455.166674759" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.300295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerStarted","Data":"f8c281769cfc18dd784e8fa9fbbc03a10ef8ed0a8040e7c26bdd334408d7d76c"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.317072 4792 generic.go:334] "Generic (PLEG): container finished" podID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerID="6f5e342bf67017c442a9bf121defb671b61b4e3a7126f9089a7c591ad96caddd" exitCode=0 Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.318815 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerDied","Data":"6f5e342bf67017c442a9bf121defb671b61b4e3a7126f9089a7c591ad96caddd"} Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.346466 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56b66ddf48-4mkl2" podStartSLOduration=6.346434025 podStartE2EDuration="6.346434025s" podCreationTimestamp="2026-03-18 15:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:26.334997323 +0000 UTC m=+1455.204326280" watchObservedRunningTime="2026-03-18 15:58:26.346434025 +0000 UTC m=+1455.215762972" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.875727 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951388 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6xc\" (UniqueName: \"kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.951730 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb\") pod \"5bab4c01-139c-4d38-81c5-d0630b78c37e\" (UID: \"5bab4c01-139c-4d38-81c5-d0630b78c37e\") " Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.959659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc" (OuterVolumeSpecName: "kube-api-access-ff6xc") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "kube-api-access-ff6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.989057 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:26 crc kubenswrapper[4792]: I0318 15:58:26.989609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.000462 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.009486 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config" (OuterVolumeSpecName: "config") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.011500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bab4c01-139c-4d38-81c5-d0630b78c37e" (UID: "5bab4c01-139c-4d38-81c5-d0630b78c37e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064699 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064736 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064751 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6xc\" (UniqueName: \"kubernetes.io/projected/5bab4c01-139c-4d38-81c5-d0630b78c37e-kube-api-access-ff6xc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064763 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" 
Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064773 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.064783 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bab4c01-139c-4d38-81c5-d0630b78c37e-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.336761 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerStarted","Data":"728161e17aded932c5b433c2e1316faa5de60de2b6d2fcc16b54f11a1a5ce672"} Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.337100 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-log" containerID="cri-o://ebcad65ca1f0af0ffda3ee08892e64aed0d92dc31f70a297586f8e0a8034cd0a" gracePeriod=30 Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.337289 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-httpd" containerID="cri-o://728161e17aded932c5b433c2e1316faa5de60de2b6d2fcc16b54f11a1a5ce672" gracePeriod=30 Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.341352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" event={"ID":"5bab4c01-139c-4d38-81c5-d0630b78c37e","Type":"ContainerDied","Data":"9f98b33e67a1109bd3ba21e887867ff134c04a60172b771ad3000f416efa4af5"} Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.341702 4792 scope.go:117] "RemoveContainer" 
containerID="39269cd75cdbf8bdaddd5c0585441c25e6f70d4203507b9c562dcacc1f6064ca" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.342687 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-kr2zd" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.347858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerStarted","Data":"8d1f902cff3f9daacd502980d6fda8381bfaac22710a46bccc6424168acb2831"} Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.348430 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-log" containerID="cri-o://f8c281769cfc18dd784e8fa9fbbc03a10ef8ed0a8040e7c26bdd334408d7d76c" gracePeriod=30 Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.348652 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-httpd" containerID="cri-o://8d1f902cff3f9daacd502980d6fda8381bfaac22710a46bccc6424168acb2831" gracePeriod=30 Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.362344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerStarted","Data":"7b33c3b1f993d9af75223d7aa0adaae51d47fce2167ad3a0b643b0af28fb7733"} Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.362827 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.386529 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=19.386505113 podStartE2EDuration="19.386505113s" podCreationTimestamp="2026-03-18 15:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:27.374757711 +0000 UTC m=+1456.244086648" watchObservedRunningTime="2026-03-18 15:58:27.386505113 +0000 UTC m=+1456.255834050" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.412911 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=18.412846065 podStartE2EDuration="18.412846065s" podCreationTimestamp="2026-03-18 15:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:27.400432963 +0000 UTC m=+1456.269761910" watchObservedRunningTime="2026-03-18 15:58:27.412846065 +0000 UTC m=+1456.282175012" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.442635 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" podStartSLOduration=7.442606678 podStartE2EDuration="7.442606678s" podCreationTimestamp="2026-03-18 15:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:27.424190424 +0000 UTC m=+1456.293519361" watchObservedRunningTime="2026-03-18 15:58:27.442606678 +0000 UTC m=+1456.311935615" Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.528576 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.548178 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-kr2zd"] Mar 18 15:58:27 crc kubenswrapper[4792]: I0318 15:58:27.943514 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5bab4c01-139c-4d38-81c5-d0630b78c37e" path="/var/lib/kubelet/pods/5bab4c01-139c-4d38-81c5-d0630b78c37e/volumes" Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.378668 4792 generic.go:334] "Generic (PLEG): container finished" podID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerID="728161e17aded932c5b433c2e1316faa5de60de2b6d2fcc16b54f11a1a5ce672" exitCode=0 Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.378992 4792 generic.go:334] "Generic (PLEG): container finished" podID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerID="ebcad65ca1f0af0ffda3ee08892e64aed0d92dc31f70a297586f8e0a8034cd0a" exitCode=143 Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.378705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerDied","Data":"728161e17aded932c5b433c2e1316faa5de60de2b6d2fcc16b54f11a1a5ce672"} Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.379058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerDied","Data":"ebcad65ca1f0af0ffda3ee08892e64aed0d92dc31f70a297586f8e0a8034cd0a"} Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.381513 4792 generic.go:334] "Generic (PLEG): container finished" podID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerID="8d1f902cff3f9daacd502980d6fda8381bfaac22710a46bccc6424168acb2831" exitCode=0 Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.381544 4792 generic.go:334] "Generic (PLEG): container finished" podID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerID="f8c281769cfc18dd784e8fa9fbbc03a10ef8ed0a8040e7c26bdd334408d7d76c" exitCode=143 Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.381605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerDied","Data":"8d1f902cff3f9daacd502980d6fda8381bfaac22710a46bccc6424168acb2831"} Mar 18 15:58:28 crc kubenswrapper[4792]: I0318 15:58:28.381647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerDied","Data":"f8c281769cfc18dd784e8fa9fbbc03a10ef8ed0a8040e7c26bdd334408d7d76c"} Mar 18 15:58:31 crc kubenswrapper[4792]: I0318 15:58:31.450042 4792 generic.go:334] "Generic (PLEG): container finished" podID="6a148250-156e-4b40-969e-3569cea8a403" containerID="59823f71f56a681e7eaa4b6e106c5069d39fdf7e592ab47bc1d91f1ef0778ed3" exitCode=0 Mar 18 15:58:31 crc kubenswrapper[4792]: I0318 15:58:31.450592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jjwfp" event={"ID":"6a148250-156e-4b40-969e-3569cea8a403","Type":"ContainerDied","Data":"59823f71f56a681e7eaa4b6e106c5069d39fdf7e592ab47bc1d91f1ef0778ed3"} Mar 18 15:58:32 crc kubenswrapper[4792]: I0318 15:58:32.462338 4792 generic.go:334] "Generic (PLEG): container finished" podID="46820dcb-64bf-40d0-ba36-827e2937de58" containerID="5524ac9f88d877ff2d5e17e2864d7fa1f95f89b4b6b117790ee5a210a64d8cc5" exitCode=0 Mar 18 15:58:32 crc kubenswrapper[4792]: I0318 15:58:32.462409 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nzh9d" event={"ID":"46820dcb-64bf-40d0-ba36-827e2937de58","Type":"ContainerDied","Data":"5524ac9f88d877ff2d5e17e2864d7fa1f95f89b4b6b117790ee5a210a64d8cc5"} Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.304768 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jjwfp" Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.324487 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.383642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle\") pod \"6a148250-156e-4b40-969e-3569cea8a403\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.384148 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sk9q\" (UniqueName: \"kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q\") pod \"6a148250-156e-4b40-969e-3569cea8a403\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.384212 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data\") pod \"6a148250-156e-4b40-969e-3569cea8a403\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.384404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts\") pod \"6a148250-156e-4b40-969e-3569cea8a403\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.384440 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs\") pod \"6a148250-156e-4b40-969e-3569cea8a403\" (UID: \"6a148250-156e-4b40-969e-3569cea8a403\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.384498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcchl\" 
(UniqueName: \"kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl\") pod \"d587f36a-ade4-499c-b160-673d58efb861\" (UID: \"d587f36a-ade4-499c-b160-673d58efb861\") " Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.385550 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs" (OuterVolumeSpecName: "logs") pod "6a148250-156e-4b40-969e-3569cea8a403" (UID: "6a148250-156e-4b40-969e-3569cea8a403"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.394512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts" (OuterVolumeSpecName: "scripts") pod "6a148250-156e-4b40-969e-3569cea8a403" (UID: "6a148250-156e-4b40-969e-3569cea8a403"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.394520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q" (OuterVolumeSpecName: "kube-api-access-4sk9q") pod "6a148250-156e-4b40-969e-3569cea8a403" (UID: "6a148250-156e-4b40-969e-3569cea8a403"). InnerVolumeSpecName "kube-api-access-4sk9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.394684 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl" (OuterVolumeSpecName: "kube-api-access-fcchl") pod "d587f36a-ade4-499c-b160-673d58efb861" (UID: "d587f36a-ade4-499c-b160-673d58efb861"). InnerVolumeSpecName "kube-api-access-fcchl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.450666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data" (OuterVolumeSpecName: "config-data") pod "6a148250-156e-4b40-969e-3569cea8a403" (UID: "6a148250-156e-4b40-969e-3569cea8a403"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.475741 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a148250-156e-4b40-969e-3569cea8a403" (UID: "6a148250-156e-4b40-969e-3569cea8a403"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.478014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564158-5fcfc" event={"ID":"d587f36a-ade4-499c-b160-673d58efb861","Type":"ContainerDied","Data":"80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31"}
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.478078 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80289016c58758a6d69c7a4a0cc9cb56149b24ce707479c847107f71dedb4f31"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.478199 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564158-5fcfc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.492951 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sk9q\" (UniqueName: \"kubernetes.io/projected/6a148250-156e-4b40-969e-3569cea8a403-kube-api-access-4sk9q\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.493005 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.493020 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.493031 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a148250-156e-4b40-969e-3569cea8a403-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.493042 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcchl\" (UniqueName: \"kubernetes.io/projected/d587f36a-ade4-499c-b160-673d58efb861-kube-api-access-fcchl\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.493076 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a148250-156e-4b40-969e-3569cea8a403-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.494086 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jjwfp" event={"ID":"6a148250-156e-4b40-969e-3569cea8a403","Type":"ContainerDied","Data":"3591a0ece444e8fb03717aad4a7d5e02539ada35b8350586d7378bc10ed3b98b"}
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.494131 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3591a0ece444e8fb03717aad4a7d5e02539ada35b8350586d7378bc10ed3b98b"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.494366 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jjwfp"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.666131 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-746b5459b4-c8cg4"]
Mar 18 15:58:33 crc kubenswrapper[4792]: E0318 15:58:33.666768 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d587f36a-ade4-499c-b160-673d58efb861" containerName="oc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.666787 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d587f36a-ade4-499c-b160-673d58efb861" containerName="oc"
Mar 18 15:58:33 crc kubenswrapper[4792]: E0318 15:58:33.666806 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a148250-156e-4b40-969e-3569cea8a403" containerName="placement-db-sync"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.666814 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a148250-156e-4b40-969e-3569cea8a403" containerName="placement-db-sync"
Mar 18 15:58:33 crc kubenswrapper[4792]: E0318 15:58:33.666853 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bab4c01-139c-4d38-81c5-d0630b78c37e" containerName="init"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.666864 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bab4c01-139c-4d38-81c5-d0630b78c37e" containerName="init"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.667144 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bab4c01-139c-4d38-81c5-d0630b78c37e" containerName="init"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.667163 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a148250-156e-4b40-969e-3569cea8a403" containerName="placement-db-sync"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.667181 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d587f36a-ade4-499c-b160-673d58efb861" containerName="oc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.670861 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.671410 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.677527 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.677819 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.678008 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zwhbp"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.678353 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.680128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.695756 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbssb\" (UniqueName: \"kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.695841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.695950 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696225 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696359 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts\") pod \"c95141f3-8c44-452b-90b6-cd67b5d42268\" (UID: \"c95141f3-8c44-452b-90b6-cd67b5d42268\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.696938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4542g\" (UniqueName: \"kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.702090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.702141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.702240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.702311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.702754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.707392 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs" (OuterVolumeSpecName: "logs") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.710109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts" (OuterVolumeSpecName: "scripts") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.724858 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-746b5459b4-c8cg4"]
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.726708 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb" (OuterVolumeSpecName: "kube-api-access-hbssb") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "kube-api-access-hbssb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.757443 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5" (OuterVolumeSpecName: "glance") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "pvc-19331b78-34cf-4f66-a604-660df9e579f5". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.758750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.800528 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.811484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.811579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.811696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.811895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4542g\" (UniqueName: \"kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812378 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbssb\" (UniqueName: \"kubernetes.io/projected/c95141f3-8c44-452b-90b6-cd67b5d42268-kube-api-access-hbssb\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812397 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812435 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") on node \"crc\" "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812452 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812465 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c95141f3-8c44-452b-90b6-cd67b5d42268-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.812481 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.815962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.836124 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4542g\" (UniqueName: \"kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.838501 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.841321 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.841613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data" (OuterVolumeSpecName: "config-data") pod "c95141f3-8c44-452b-90b6-cd67b5d42268" (UID: "c95141f3-8c44-452b-90b6-cd67b5d42268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.852959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.860516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.864099 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs\") pod \"placement-746b5459b4-c8cg4\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.900502 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.900653 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-19331b78-34cf-4f66-a604-660df9e579f5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5") on node "crc"
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.929846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9446\" (UniqueName: \"kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.929981 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.930014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.931406 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.931439 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.931592 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.931639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs\") pod \"5b2379d9-5953-4e5d-b099-56fbe190c47b\" (UID: \"5b2379d9-5953-4e5d-b099-56fbe190c47b\") "
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.932602 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95141f3-8c44-452b-90b6-cd67b5d42268-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.932621 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.939003 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts" (OuterVolumeSpecName: "scripts") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.939339 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.945047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs" (OuterVolumeSpecName: "logs") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.950660 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446" (OuterVolumeSpecName: "kube-api-access-q9446") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "kube-api-access-q9446". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.978518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b" (OuterVolumeSpecName: "glance") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "pvc-8bcf0461-07ee-43b5-b329-fde264578b3b". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 15:58:33 crc kubenswrapper[4792]: I0318 15:58:33.988287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.013737 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-746b5459b4-c8cg4"
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.014242 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data" (OuterVolumeSpecName: "config-data") pod "5b2379d9-5953-4e5d-b099-56fbe190c47b" (UID: "5b2379d9-5953-4e5d-b099-56fbe190c47b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034447 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9446\" (UniqueName: \"kubernetes.io/projected/5b2379d9-5953-4e5d-b099-56fbe190c47b-kube-api-access-q9446\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034493 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034508 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034521 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034533 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b2379d9-5953-4e5d-b099-56fbe190c47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034573 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") on node \"crc\" "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.034588 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b2379d9-5953-4e5d-b099-56fbe190c47b-logs\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.084386 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.084627 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8bcf0461-07ee-43b5-b329-fde264578b3b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b") on node "crc"
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.137731 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.191592 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nzh9d"
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244620 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244647 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244724 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.244899 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbfg\" (UniqueName: \"kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg\") pod \"46820dcb-64bf-40d0-ba36-827e2937de58\" (UID: \"46820dcb-64bf-40d0-ba36-827e2937de58\") "
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.258527 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts" (OuterVolumeSpecName: "scripts") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.268559 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg" (OuterVolumeSpecName: "kube-api-access-5mbfg") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "kube-api-access-5mbfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.268676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.288249 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.293764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.298364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data" (OuterVolumeSpecName: "config-data") pod "46820dcb-64bf-40d0-ba36-827e2937de58" (UID: "46820dcb-64bf-40d0-ba36-827e2937de58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347333 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347384 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347398 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbfg\" (UniqueName: \"kubernetes.io/projected/46820dcb-64bf-40d0-ba36-827e2937de58-kube-api-access-5mbfg\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347411 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347425 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.347435 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46820dcb-64bf-40d0-ba36-827e2937de58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.428172 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-9mw9z"]
Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.443360 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564152-9mw9z"]
Mar 18
15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.526258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b2379d9-5953-4e5d-b099-56fbe190c47b","Type":"ContainerDied","Data":"a8c4706e66b76063d854f9aea30a67c0b194925ddcb721ae4a5ebe24d74353ed"} Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.526311 4792 scope.go:117] "RemoveContainer" containerID="728161e17aded932c5b433c2e1316faa5de60de2b6d2fcc16b54f11a1a5ce672" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.526480 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.545256 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nzh9d" event={"ID":"46820dcb-64bf-40d0-ba36-827e2937de58","Type":"ContainerDied","Data":"a8c773b849431af1a623329a2ae174b3ba5bf04b93e36c85c880f70fa947a233"} Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.545520 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c773b849431af1a623329a2ae174b3ba5bf04b93e36c85c880f70fa947a233" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.545581 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nzh9d" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.603634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c95141f3-8c44-452b-90b6-cd67b5d42268","Type":"ContainerDied","Data":"59c43ef4a71932eda0c59b358ec3b7001ae40efb73058b621c989572a5dd322d"} Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.603747 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.615704 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.626364 4792 scope.go:117] "RemoveContainer" containerID="ebcad65ca1f0af0ffda3ee08892e64aed0d92dc31f70a297586f8e0a8034cd0a" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.647686 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.669060 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69977cc675-l62x5"] Mar 18 15:58:34 crc kubenswrapper[4792]: E0318 15:58:34.669673 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46820dcb-64bf-40d0-ba36-827e2937de58" containerName="keystone-bootstrap" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.670945 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="46820dcb-64bf-40d0-ba36-827e2937de58" containerName="keystone-bootstrap" Mar 18 15:58:34 crc kubenswrapper[4792]: E0318 15:58:34.671145 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.671219 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: E0318 15:58:34.671293 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.671346 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: E0318 15:58:34.671422 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.671474 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: E0318 15:58:34.671536 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.671586 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.671930 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.672030 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-httpd" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.672090 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.672161 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" containerName="glance-log" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.672229 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="46820dcb-64bf-40d0-ba36-827e2937de58" containerName="keystone-bootstrap" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.673288 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.681191 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerStarted","Data":"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12"} Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.690516 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-b84g6" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.690903 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.691079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.691227 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.691225 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.695615 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.714822 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2l2s" event={"ID":"2f4cf6d4-998f-445b-82ed-25b2b4670875","Type":"ContainerStarted","Data":"38d4f0b33dd4c7c042ecfcb2c172e3ec32a5ca02551c744e047d927ce6862d7f"} Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.728481 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.730587 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.739988 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sw42f" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.740225 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.740382 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.740535 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.809026 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69977cc675-l62x5"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.842876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.857025 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-746b5459b4-c8cg4"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.858537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-public-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.858723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod 
\"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.858810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.858894 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-internal-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859083 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-credential-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859201 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9dvj\" (UniqueName: \"kubernetes.io/projected/63e5fc07-9299-40b7-91c9-7a2442362d9a-kube-api-access-j9dvj\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gtp\" (UniqueName: \"kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-scripts\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-fernet-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859656 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-combined-ca-bundle\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859773 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-config-data\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.859981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.860582 4792 scope.go:117] "RemoveContainer" 
containerID="8d1f902cff3f9daacd502980d6fda8381bfaac22710a46bccc6424168acb2831" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.872181 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.900489 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.933415 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.936083 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.940948 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.941232 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.951349 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k2l2s" podStartSLOduration=4.647141463 podStartE2EDuration="45.95131997s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="2026-03-18 15:57:51.877679398 +0000 UTC m=+1420.747008335" lastFinishedPulling="2026-03-18 15:58:33.181857905 +0000 UTC m=+1462.051186842" observedRunningTime="2026-03-18 15:58:34.750733993 +0000 UTC m=+1463.620062930" watchObservedRunningTime="2026-03-18 15:58:34.95131997 +0000 UTC m=+1463.820648917" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.963419 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964201 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-internal-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-credential-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.964935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.965581 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.965747 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9dvj\" (UniqueName: \"kubernetes.io/projected/63e5fc07-9299-40b7-91c9-7a2442362d9a-kube-api-access-j9dvj\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.965829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gtp\" (UniqueName: 
\"kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.965938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-scripts\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.966863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-fernet-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.966991 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.967997 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-combined-ca-bundle\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.969712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.969904 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bng\" (UniqueName: \"kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.970020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.970108 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-config-data\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.970193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.967469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.975919 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.975988 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f4135ef5c687380422f9124ccce113815e08bdbecc9d37bdfa336b12e119b7ff/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.976518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.976914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.977064 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.977100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.977137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-public-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:34 crc kubenswrapper[4792]: I0318 15:58:34.986888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.064295 4792 scope.go:117] "RemoveContainer" containerID="f8c281769cfc18dd784e8fa9fbbc03a10ef8ed0a8040e7c26bdd334408d7d76c" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.080678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.080730 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" 
Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.080810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.080868 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.080951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.081118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.081165 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bng\" (UniqueName: \"kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 
15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.081235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.086231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.105435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.113693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.119095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.119534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.121321 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-internal-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.121739 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.124080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.130846 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-credential-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.131055 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.131083 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b0cea5cb737a527758633cd5e46883214d92b47b2dcb8de95269a535706d518/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.135909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-fernet-keys\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.150391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.150598 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-scripts\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.151767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-combined-ca-bundle\") pod \"keystone-69977cc675-l62x5\" (UID: 
\"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.154749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-config-data\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.154831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63e5fc07-9299-40b7-91c9-7a2442362d9a-public-tls-certs\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.162589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9dvj\" (UniqueName: \"kubernetes.io/projected/63e5fc07-9299-40b7-91c9-7a2442362d9a-kube-api-access-j9dvj\") pod \"keystone-69977cc675-l62x5\" (UID: \"63e5fc07-9299-40b7-91c9-7a2442362d9a\") " pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.162789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.162827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gtp\" (UniqueName: \"kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " 
pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.164195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.171098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bng\" (UniqueName: \"kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.321988 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-79db488c56-wbmrk"] Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.324506 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.342346 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.343627 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391685 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-internal-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391796 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x55lf\" (UniqueName: \"kubernetes.io/projected/371480fb-9244-4210-9db0-30d2fffdc422-kube-api-access-x55lf\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-public-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-config-data\") pod 
\"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-combined-ca-bundle\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.391952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-scripts\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.393149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/371480fb-9244-4210-9db0-30d2fffdc422-logs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.403602 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79db488c56-wbmrk"] Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.424727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.444943 4792 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-internal-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x55lf\" (UniqueName: \"kubernetes.io/projected/371480fb-9244-4210-9db0-30d2fffdc422-kube-api-access-x55lf\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496635 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-public-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496653 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-config-data\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496673 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-combined-ca-bundle\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " 
pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-scripts\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.496740 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/371480fb-9244-4210-9db0-30d2fffdc422-logs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.497256 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/371480fb-9244-4210-9db0-30d2fffdc422-logs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.505633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-internal-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.505653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-combined-ca-bundle\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.508340 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-public-tls-certs\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.508585 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-scripts\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.508817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/371480fb-9244-4210-9db0-30d2fffdc422-config-data\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.523439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x55lf\" (UniqueName: \"kubernetes.io/projected/371480fb-9244-4210-9db0-30d2fffdc422-kube-api-access-x55lf\") pod \"placement-79db488c56-wbmrk\" (UID: \"371480fb-9244-4210-9db0-30d2fffdc422\") " pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.585627 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.593698 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.615953 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.737230 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.737693 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="dnsmasq-dns" containerID="cri-o://d69bf31ecbcf4ba50c202b35804067622a79483ed926ca7b8f789d47a973bb48" gracePeriod=10 Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.772839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerStarted","Data":"a81b893557361b5cc2a7a3cceb7b3e34d33bcb07d06a4567a75970dcaf113370"} Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.818594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pfgr6" event={"ID":"379ff25e-6c0a-45d4-a478-87a5e136aa47","Type":"ContainerStarted","Data":"fb80b33f4c260a249e77205e69bd3991689f30d26a856db44390982be8e4d6a0"} Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.840201 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-pfgr6" podStartSLOduration=3.908955771 podStartE2EDuration="46.840178533s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="2026-03-18 15:57:51.424667298 +0000 UTC m=+1420.293996235" lastFinishedPulling="2026-03-18 15:58:34.35589006 +0000 UTC m=+1463.225218997" observedRunningTime="2026-03-18 15:58:35.836330682 +0000 UTC m=+1464.705659639" watchObservedRunningTime="2026-03-18 15:58:35.840178533 +0000 UTC 
m=+1464.709507470" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.959846 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cab244a-8839-4173-930c-b5fed3a6fde1" path="/var/lib/kubelet/pods/3cab244a-8839-4173-930c-b5fed3a6fde1/volumes" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.960634 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2379d9-5953-4e5d-b099-56fbe190c47b" path="/var/lib/kubelet/pods/5b2379d9-5953-4e5d-b099-56fbe190c47b/volumes" Mar 18 15:58:35 crc kubenswrapper[4792]: I0318 15:58:35.961338 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c95141f3-8c44-452b-90b6-cd67b5d42268" path="/var/lib/kubelet/pods/c95141f3-8c44-452b-90b6-cd67b5d42268/volumes" Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.150201 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69977cc675-l62x5"] Mar 18 15:58:36 crc kubenswrapper[4792]: W0318 15:58:36.163258 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63e5fc07_9299_40b7_91c9_7a2442362d9a.slice/crio-dc3a47efa2e2697b20b3465dd459fc68d32f41c9f2155f1f5be3e376f6d2ea40 WatchSource:0}: Error finding container dc3a47efa2e2697b20b3465dd459fc68d32f41c9f2155f1f5be3e376f6d2ea40: Status 404 returned error can't find the container with id dc3a47efa2e2697b20b3465dd459fc68d32f41c9f2155f1f5be3e376f6d2ea40 Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.685243 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79db488c56-wbmrk"] Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.830441 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.892179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69977cc675-l62x5" 
event={"ID":"63e5fc07-9299-40b7-91c9-7a2442362d9a","Type":"ContainerStarted","Data":"3a94d19e951850a065f63b0b6713e28a9cb98fc038910ed31aaa9da1422dfb9d"} Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.892233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69977cc675-l62x5" event={"ID":"63e5fc07-9299-40b7-91c9-7a2442362d9a","Type":"ContainerStarted","Data":"dc3a47efa2e2697b20b3465dd459fc68d32f41c9f2155f1f5be3e376f6d2ea40"} Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.905787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerStarted","Data":"1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2"} Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.950267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.972224 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff35f858-03ee-44eb-8333-2f426efa6281" containerID="d69bf31ecbcf4ba50c202b35804067622a79483ed926ca7b8f789d47a973bb48" exitCode=0 Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.972331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" event={"ID":"ff35f858-03ee-44eb-8333-2f426efa6281","Type":"ContainerDied","Data":"d69bf31ecbcf4ba50c202b35804067622a79483ed926ca7b8f789d47a973bb48"} Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.982858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79db488c56-wbmrk" event={"ID":"371480fb-9244-4210-9db0-30d2fffdc422","Type":"ContainerStarted","Data":"4119b6151903c3e0fdda579fc4bdf0caeb47053d199cff11a611e465e3e99663"} Mar 18 15:58:36 crc kubenswrapper[4792]: I0318 15:58:36.992651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ncpfm" 
event={"ID":"6967171c-e427-4723-ae1e-25e3bad61d59","Type":"ContainerStarted","Data":"c7c8c1f8ff9a63368d44b46282339ef63381908aeb1144e538578474823bc61e"} Mar 18 15:58:36 crc kubenswrapper[4792]: W0318 15:58:36.993007 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f9ef10_a6e9_49ac_bbcd_9786c9e08bdb.slice/crio-f2f2e10b307366e85eb321522a824018c769c9876c759822963d016084516447 WatchSource:0}: Error finding container f2f2e10b307366e85eb321522a824018c769c9876c759822963d016084516447: Status 404 returned error can't find the container with id f2f2e10b307366e85eb321522a824018c769c9876c759822963d016084516447 Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.030391 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ncpfm" podStartSLOduration=5.95197243 podStartE2EDuration="48.03037108s" podCreationTimestamp="2026-03-18 15:57:49 +0000 UTC" firstStartedPulling="2026-03-18 15:57:52.359390432 +0000 UTC m=+1421.228719369" lastFinishedPulling="2026-03-18 15:58:34.437789072 +0000 UTC m=+1463.307118019" observedRunningTime="2026-03-18 15:58:37.028239893 +0000 UTC m=+1465.897568830" watchObservedRunningTime="2026-03-18 15:58:37.03037108 +0000 UTC m=+1465.899700017" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.143545 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.284741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-268r8\" (UniqueName: \"kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.284812 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.284960 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.285121 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.285166 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.285209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config\") pod \"ff35f858-03ee-44eb-8333-2f426efa6281\" (UID: \"ff35f858-03ee-44eb-8333-2f426efa6281\") " Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.329323 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8" (OuterVolumeSpecName: "kube-api-access-268r8") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "kube-api-access-268r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.392592 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-268r8\" (UniqueName: \"kubernetes.io/projected/ff35f858-03ee-44eb-8333-2f426efa6281-kube-api-access-268r8\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.474335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config" (OuterVolumeSpecName: "config") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.494205 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.529946 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.543271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.546657 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.581807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff35f858-03ee-44eb-8333-2f426efa6281" (UID: "ff35f858-03ee-44eb-8333-2f426efa6281"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.601542 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.601575 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.601586 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:37 crc kubenswrapper[4792]: I0318 15:58:37.601600 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff35f858-03ee-44eb-8333-2f426efa6281-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.111638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" event={"ID":"ff35f858-03ee-44eb-8333-2f426efa6281","Type":"ContainerDied","Data":"54e965186b30467eda031c1a615ed2387546ed0a0565a4ec76d4254d4e09e695"} Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.111701 4792 scope.go:117] "RemoveContainer" containerID="d69bf31ecbcf4ba50c202b35804067622a79483ed926ca7b8f789d47a973bb48" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.111716 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-dhpb9" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.117235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79db488c56-wbmrk" event={"ID":"371480fb-9244-4210-9db0-30d2fffdc422","Type":"ContainerStarted","Data":"b0baecec52a73ad0d8ce20fa01f9eaf7f48c9b769cf872e43082a0558a92c31e"} Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.155910 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.162367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerStarted","Data":"f2f2e10b307366e85eb321522a824018c769c9876c759822963d016084516447"} Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.166721 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-dhpb9"] Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.172999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerStarted","Data":"b20d86640397a3070507e865b74858fef67c93c85ea8ddb9f09ad594bfab4a85"} Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.175715 4792 scope.go:117] "RemoveContainer" containerID="4ec3145a609be8415bcaa05415461b41b58066f8b531dd9dbd4c32471649b088" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.184358 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerStarted","Data":"8a2a66e0e6d953baf012fd12252c65d6f4230c2b9665d95e82df84d33bf48d79"} Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.184438 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.184510 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.184529 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.225300 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-746b5459b4-c8cg4" podStartSLOduration=5.225278146 podStartE2EDuration="5.225278146s" podCreationTimestamp="2026-03-18 15:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:38.217311114 +0000 UTC m=+1467.086640071" watchObservedRunningTime="2026-03-18 15:58:38.225278146 +0000 UTC m=+1467.094607093" Mar 18 15:58:38 crc kubenswrapper[4792]: I0318 15:58:38.297561 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69977cc675-l62x5" podStartSLOduration=4.297534322 podStartE2EDuration="4.297534322s" podCreationTimestamp="2026-03-18 15:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:38.271239231 +0000 UTC m=+1467.140568178" watchObservedRunningTime="2026-03-18 15:58:38.297534322 +0000 UTC m=+1467.166863329" Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.205614 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79db488c56-wbmrk" event={"ID":"371480fb-9244-4210-9db0-30d2fffdc422","Type":"ContainerStarted","Data":"653397efea0d349230be75f83827b581ac607c96271c8b1bda06060601cb13e7"} Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.206297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.206318 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.210468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerStarted","Data":"ad5fb530529a4dbccd795cc78a5b9e61b1472a301ede8f4476b67f8fb2bed1ed"} Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.213082 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerStarted","Data":"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531"} Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.213952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerStarted","Data":"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0"} Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.233267 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-79db488c56-wbmrk" podStartSLOduration=4.233246878 podStartE2EDuration="4.233246878s" podCreationTimestamp="2026-03-18 15:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:39.226052991 +0000 UTC m=+1468.095381928" watchObservedRunningTime="2026-03-18 15:58:39.233246878 +0000 UTC m=+1468.102575815" Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.311229 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.311196774 podStartE2EDuration="5.311196774s" 
podCreationTimestamp="2026-03-18 15:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:39.281573008 +0000 UTC m=+1468.150901945" watchObservedRunningTime="2026-03-18 15:58:39.311196774 +0000 UTC m=+1468.180525711" Mar 18 15:58:39 crc kubenswrapper[4792]: I0318 15:58:39.877358 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff35f858-03ee-44eb-8333-2f426efa6281" path="/var/lib/kubelet/pods/ff35f858-03ee-44eb-8333-2f426efa6281/volumes" Mar 18 15:58:40 crc kubenswrapper[4792]: I0318 15:58:40.239284 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerStarted","Data":"f8888569eee935a1c69bd0a3ce7065b1bc1591660f3caa6f971de14a046dee38"} Mar 18 15:58:40 crc kubenswrapper[4792]: I0318 15:58:40.241574 4792 generic.go:334] "Generic (PLEG): container finished" podID="2f4cf6d4-998f-445b-82ed-25b2b4670875" containerID="38d4f0b33dd4c7c042ecfcb2c172e3ec32a5ca02551c744e047d927ce6862d7f" exitCode=0 Mar 18 15:58:40 crc kubenswrapper[4792]: I0318 15:58:40.241680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2l2s" event={"ID":"2f4cf6d4-998f-445b-82ed-25b2b4670875","Type":"ContainerDied","Data":"38d4f0b33dd4c7c042ecfcb2c172e3ec32a5ca02551c744e047d927ce6862d7f"} Mar 18 15:58:40 crc kubenswrapper[4792]: I0318 15:58:40.275146 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.275120364 podStartE2EDuration="6.275120364s" podCreationTimestamp="2026-03-18 15:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:40.272217792 +0000 UTC m=+1469.141546759" watchObservedRunningTime="2026-03-18 15:58:40.275120364 +0000 
UTC m=+1469.144449301" Mar 18 15:58:43 crc kubenswrapper[4792]: I0318 15:58:43.294257 4792 generic.go:334] "Generic (PLEG): container finished" podID="379ff25e-6c0a-45d4-a478-87a5e136aa47" containerID="fb80b33f4c260a249e77205e69bd3991689f30d26a856db44390982be8e4d6a0" exitCode=0 Mar 18 15:58:43 crc kubenswrapper[4792]: I0318 15:58:43.294608 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pfgr6" event={"ID":"379ff25e-6c0a-45d4-a478-87a5e136aa47","Type":"ContainerDied","Data":"fb80b33f4c260a249e77205e69bd3991689f30d26a856db44390982be8e4d6a0"} Mar 18 15:58:44 crc kubenswrapper[4792]: I0318 15:58:44.306175 4792 generic.go:334] "Generic (PLEG): container finished" podID="6967171c-e427-4723-ae1e-25e3bad61d59" containerID="c7c8c1f8ff9a63368d44b46282339ef63381908aeb1144e538578474823bc61e" exitCode=0 Mar 18 15:58:44 crc kubenswrapper[4792]: I0318 15:58:44.306345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ncpfm" event={"ID":"6967171c-e427-4723-ae1e-25e3bad61d59","Type":"ContainerDied","Data":"c7c8c1f8ff9a63368d44b46282339ef63381908aeb1144e538578474823bc61e"} Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.428365 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.448866 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.448956 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.512303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.516778 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle\") pod \"2f4cf6d4-998f-445b-82ed-25b2b4670875\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.516861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmkj\" (UniqueName: \"kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj\") pod \"2f4cf6d4-998f-445b-82ed-25b2b4670875\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.516994 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data\") pod \"2f4cf6d4-998f-445b-82ed-25b2b4670875\" (UID: \"2f4cf6d4-998f-445b-82ed-25b2b4670875\") " Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.522943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"2f4cf6d4-998f-445b-82ed-25b2b4670875" (UID: "2f4cf6d4-998f-445b-82ed-25b2b4670875"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.523512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj" (OuterVolumeSpecName: "kube-api-access-4hmkj") pod "2f4cf6d4-998f-445b-82ed-25b2b4670875" (UID: "2f4cf6d4-998f-445b-82ed-25b2b4670875"). InnerVolumeSpecName "kube-api-access-4hmkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.526317 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.578436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f4cf6d4-998f-445b-82ed-25b2b4670875" (UID: "2f4cf6d4-998f-445b-82ed-25b2b4670875"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.586880 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.588377 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.619957 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.620013 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmkj\" (UniqueName: \"kubernetes.io/projected/2f4cf6d4-998f-445b-82ed-25b2b4670875-kube-api-access-4hmkj\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.620028 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4cf6d4-998f-445b-82ed-25b2b4670875-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.640446 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:45 crc kubenswrapper[4792]: I0318 15:58:45.652535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.011994 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.019672 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-pfgr6" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140114 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle\") pod \"379ff25e-6c0a-45d4-a478-87a5e136aa47\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140577 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140605 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140670 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqdmz\" (UniqueName: \"kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz\") pod \"379ff25e-6c0a-45d4-a478-87a5e136aa47\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140720 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data\") pod \"379ff25e-6c0a-45d4-a478-87a5e136aa47\" (UID: \"379ff25e-6c0a-45d4-a478-87a5e136aa47\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.140786 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.141298 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.141352 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.141393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lbm4\" (UniqueName: \"kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4\") pod \"6967171c-e427-4723-ae1e-25e3bad61d59\" (UID: \"6967171c-e427-4723-ae1e-25e3bad61d59\") " Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.142136 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6967171c-e427-4723-ae1e-25e3bad61d59-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 
crc kubenswrapper[4792]: I0318 15:58:46.144441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.144504 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz" (OuterVolumeSpecName: "kube-api-access-qqdmz") pod "379ff25e-6c0a-45d4-a478-87a5e136aa47" (UID: "379ff25e-6c0a-45d4-a478-87a5e136aa47"). InnerVolumeSpecName "kube-api-access-qqdmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.145104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts" (OuterVolumeSpecName: "scripts") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.145898 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4" (OuterVolumeSpecName: "kube-api-access-8lbm4") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "kube-api-access-8lbm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.177447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.181922 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "379ff25e-6c0a-45d4-a478-87a5e136aa47" (UID: "379ff25e-6c0a-45d4-a478-87a5e136aa47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.205903 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data" (OuterVolumeSpecName: "config-data") pod "6967171c-e427-4723-ae1e-25e3bad61d59" (UID: "6967171c-e427-4723-ae1e-25e3bad61d59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.237180 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data" (OuterVolumeSpecName: "config-data") pod "379ff25e-6c0a-45d4-a478-87a5e136aa47" (UID: "379ff25e-6c0a-45d4-a478-87a5e136aa47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.243635 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.243776 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqdmz\" (UniqueName: \"kubernetes.io/projected/379ff25e-6c0a-45d4-a478-87a5e136aa47-kube-api-access-qqdmz\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.243841 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.243903 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.244067 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.244135 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6967171c-e427-4723-ae1e-25e3bad61d59-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.244216 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lbm4\" (UniqueName: \"kubernetes.io/projected/6967171c-e427-4723-ae1e-25e3bad61d59-kube-api-access-8lbm4\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.244291 
4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ff25e-6c0a-45d4-a478-87a5e136aa47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.252513 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.333304 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerStarted","Data":"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96"} Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.333371 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="ceilometer-notification-agent" containerID="cri-o://1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598" gracePeriod=30 Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.333393 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="proxy-httpd" containerID="cri-o://12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96" gracePeriod=30 Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.333406 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.333467 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="sg-core" 
containerID="cri-o://2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12" gracePeriod=30 Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.335219 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2l2s" event={"ID":"2f4cf6d4-998f-445b-82ed-25b2b4670875","Type":"ContainerDied","Data":"82dd6283f71e0d4188a5e633e1108f38df1dfe9f745fd16552c9f21e82206f4f"} Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.335247 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82dd6283f71e0d4188a5e633e1108f38df1dfe9f745fd16552c9f21e82206f4f" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.335368 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k2l2s" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.338674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pfgr6" event={"ID":"379ff25e-6c0a-45d4-a478-87a5e136aa47","Type":"ContainerDied","Data":"b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d"} Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.338717 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b864647b670415dbf3392e795c29ad9f260f453409badad4631dd08ed392c99d" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.338779 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pfgr6" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.353494 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ncpfm" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.353564 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ncpfm" event={"ID":"6967171c-e427-4723-ae1e-25e3bad61d59","Type":"ContainerDied","Data":"c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c"} Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.353600 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c002043694cb4ff2d58d14c6dcc32093e1ce955dd9ecf417032b4dd31e4ba21c" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.354923 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.355009 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.355034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.355049 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.592226 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.593889 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" containerName="heat-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.593998 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" containerName="heat-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.594074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="dnsmasq-dns" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594137 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="dnsmasq-dns" Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.594196 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" containerName="barbican-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594253 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" containerName="barbican-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.594317 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" containerName="cinder-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594379 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" containerName="cinder-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: E0318 15:58:46.594444 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="init" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594499 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="init" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594758 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" containerName="barbican-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594831 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" containerName="heat-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594894 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff35f858-03ee-44eb-8333-2f426efa6281" containerName="dnsmasq-dns" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.594957 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" containerName="cinder-db-sync" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.596177 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.602794 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.603098 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fr4wl" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.603155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.603298 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.616888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.699069 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.701281 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.740494 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.762933 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.763035 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.763096 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.763179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.763231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tnb\" (UniqueName: 
\"kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.763262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.809288 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.811663 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.815610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.824323 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" 
Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tnb\" (UniqueName: \"kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 
15:58:46.865447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhf25\" (UniqueName: \"kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.865637 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.877088 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.879108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.879179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.890078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.933261 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tnb\" (UniqueName: \"kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb\") pod \"cinder-scheduler-0\" (UID: 
\"d48439a9-65ab-4389-8776-80321175e016\") " pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.944821 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.967197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhf25\" (UniqueName: \"kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980559 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980766 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlg6g\" (UniqueName: \"kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.980985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.982415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.983023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 
15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.983621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.984331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.984810 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.991438 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d8b96c4f5-87qzw"] Mar 18 15:58:46 crc kubenswrapper[4792]: I0318 15:58:46.993320 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.001510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qfds7" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.001789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.015379 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.031032 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d8b96c4f5-87qzw"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.033937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhf25\" (UniqueName: \"kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25\") pod \"dnsmasq-dns-774db89647-lgxjn\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.062653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.090468 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7bf8bcc748-g9kq5"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.092838 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.097230 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098812 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098842 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5rh\" (UniqueName: \"kubernetes.io/projected/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-kube-api-access-6z5rh\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-combined-ca-bundle\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098891 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098909 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.098990 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlg6g\" (UniqueName: \"kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " 
pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data-custom\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-logs\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099245 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.099727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.124674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.126489 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.126489 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.127228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.168506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlg6g\" (UniqueName: \"kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g\") pod \"cinder-api-0\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.212739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442ae180-60ac-4d2c-92eb-b9a823ba74a9-logs\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.220791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data\") pod \"barbican-worker-7d8b96c4f5-87qzw\" 
(UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data-custom\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data-custom\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221310 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-logs\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221578 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4tnb\" (UniqueName: 
\"kubernetes.io/projected/442ae180-60ac-4d2c-92eb-b9a823ba74a9-kube-api-access-n4tnb\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-combined-ca-bundle\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.221947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5rh\" (UniqueName: \"kubernetes.io/projected/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-kube-api-access-6z5rh\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.222089 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-combined-ca-bundle\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.225900 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bf8bcc748-g9kq5"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.226244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-logs\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: 
\"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.238127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-combined-ca-bundle\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.251090 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data-custom\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.254759 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.262862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-config-data\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.270646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5rh\" (UniqueName: \"kubernetes.io/projected/fbcb88bd-9c5a-4e8f-bbe2-0109d7751292-kube-api-access-6z5rh\") pod \"barbican-worker-7d8b96c4f5-87qzw\" (UID: \"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292\") " pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.325525 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 
15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442ae180-60ac-4d2c-92eb-b9a823ba74a9-logs\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data-custom\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328690 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328704 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4tnb\" (UniqueName: \"kubernetes.io/projected/442ae180-60ac-4d2c-92eb-b9a823ba74a9-kube-api-access-n4tnb\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.328771 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-combined-ca-bundle\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.345537 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442ae180-60ac-4d2c-92eb-b9a823ba74a9-logs\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.369576 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.369931 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-combined-ca-bundle\") pod 
\"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.370379 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/442ae180-60ac-4d2c-92eb-b9a823ba74a9-config-data-custom\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.379098 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.393991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4tnb\" (UniqueName: \"kubernetes.io/projected/442ae180-60ac-4d2c-92eb-b9a823ba74a9-kube-api-access-n4tnb\") pod \"barbican-keystone-listener-7bf8bcc748-g9kq5\" (UID: \"442ae180-60ac-4d2c-92eb-b9a823ba74a9\") " pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.406450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.477629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.490629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.544441 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.582854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.582916 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.582957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.585414 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.585609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config\") pod 
\"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.585862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2sd\" (UniqueName: \"kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.587470 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerID="2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12" exitCode=2 Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.587668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerDied","Data":"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12"} Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.644716 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708698 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2sd\" (UniqueName: \"kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708752 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " 
pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.708817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.711863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.712449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.713058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.713585 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.719706 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.723412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.733242 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.804358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2sd\" (UniqueName: \"kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd\") pod \"dnsmasq-dns-6578955fd5-8khs2\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.812598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.812800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.819472 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.819540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.819589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfhjp\" (UniqueName: \"kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.922568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.922618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.922639 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pfhjp\" (UniqueName: \"kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.922768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.922923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.926374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.931962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.940918 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.943717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:47 crc kubenswrapper[4792]: I0318 15:58:47.959719 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfhjp\" (UniqueName: \"kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp\") pod \"barbican-api-85d596b878-mgtbk\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.058632 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.087808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.095664 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.264741 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.525153 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bf8bcc748-g9kq5"] Mar 18 15:58:48 crc kubenswrapper[4792]: E0318 15:58:48.616322 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeaa1087_1d8f_4041_a5c6_506f3f2a14fb.slice/crio-conmon-6daf7967b4fe0584f3736dc5540a31f35c8a54c8425775b5dee887bf53d516e3.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.626385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" event={"ID":"442ae180-60ac-4d2c-92eb-b9a823ba74a9","Type":"ContainerStarted","Data":"b42cdc3336b24c197fd0bf57dcd705561d376c33d00b0dde11ef93ba3bd27c1c"} Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.628657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerStarted","Data":"e137849d4ed0dc225d28d86f5ab5ed2ddd2d2e2eb0fa486a7cc55a182409066e"} Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.641216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-lgxjn" event={"ID":"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb","Type":"ContainerStarted","Data":"adf33c0273c3969af48592a0f17620bc86901d9bc5550b95de58a0e9a1b9ef3b"} Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 
15:58:48.807986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:58:48 crc kubenswrapper[4792]: I0318 15:58:48.831764 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d8b96c4f5-87qzw"] Mar 18 15:58:48 crc kubenswrapper[4792]: W0318 15:58:48.849915 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbcb88bd_9c5a_4e8f_bbe2_0109d7751292.slice/crio-1ed4e24385f11da81a80b04cc16c0aff37583338fcd11accedb72128c30a4a27 WatchSource:0}: Error finding container 1ed4e24385f11da81a80b04cc16c0aff37583338fcd11accedb72128c30a4a27: Status 404 returned error can't find the container with id 1ed4e24385f11da81a80b04cc16c0aff37583338fcd11accedb72128c30a4a27 Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.141664 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 15:58:49 crc kubenswrapper[4792]: W0318 15:58:49.145233 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5378b261_2167_4eae_a672_0dd816a99a18.slice/crio-525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac WatchSource:0}: Error finding container 525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac: Status 404 returned error can't find the container with id 525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.318620 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:58:49 crc kubenswrapper[4792]: W0318 15:58:49.354095 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac4b1400_97cc_426c_9364_8d03b7c43037.slice/crio-175712cb16ead78d1ecb2dfd2fadfc99a5d4f259d91207cbe6fa6dea0379a5c2 
WatchSource:0}: Error finding container 175712cb16ead78d1ecb2dfd2fadfc99a5d4f259d91207cbe6fa6dea0379a5c2: Status 404 returned error can't find the container with id 175712cb16ead78d1ecb2dfd2fadfc99a5d4f259d91207cbe6fa6dea0379a5c2 Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.682351 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" event={"ID":"5378b261-2167-4eae-a672-0dd816a99a18","Type":"ContainerStarted","Data":"525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac"} Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.696288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" event={"ID":"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292","Type":"ContainerStarted","Data":"1ed4e24385f11da81a80b04cc16c0aff37583338fcd11accedb72128c30a4a27"} Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.739370 4792 generic.go:334] "Generic (PLEG): container finished" podID="aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" containerID="6daf7967b4fe0584f3736dc5540a31f35c8a54c8425775b5dee887bf53d516e3" exitCode=0 Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.739490 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-lgxjn" event={"ID":"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb","Type":"ContainerDied","Data":"6daf7967b4fe0584f3736dc5540a31f35c8a54c8425775b5dee887bf53d516e3"} Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.804324 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerID="1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598" exitCode=0 Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.804476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerDied","Data":"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598"} Mar 18 15:58:49 
crc kubenswrapper[4792]: I0318 15:58:49.819808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerStarted","Data":"175712cb16ead78d1ecb2dfd2fadfc99a5d4f259d91207cbe6fa6dea0379a5c2"} Mar 18 15:58:49 crc kubenswrapper[4792]: I0318 15:58:49.926145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerStarted","Data":"24c3e451888c9aae9218fe9a33b60d7d7854a843c7b629eeafa652da78a83921"} Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.283772 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.318678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhf25\" (UniqueName: \"kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.319591 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.319651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.319746 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.319889 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.319914 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config\") pod \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\" (UID: \"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb\") " Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.325170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25" (OuterVolumeSpecName: "kube-api-access-hhf25") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "kube-api-access-hhf25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.373161 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config" (OuterVolumeSpecName: "config") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.376169 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.380573 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.382059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.423039 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.423266 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.423383 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.423449 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.423515 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhf25\" (UniqueName: \"kubernetes.io/projected/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-kube-api-access-hhf25\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.424424 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" (UID: "aeaa1087-1d8f-4041-a5c6-506f3f2a14fb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.507276 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.526446 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.796739 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.931526 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-lgxjn" event={"ID":"aeaa1087-1d8f-4041-a5c6-506f3f2a14fb","Type":"ContainerDied","Data":"adf33c0273c3969af48592a0f17620bc86901d9bc5550b95de58a0e9a1b9ef3b"} Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.931833 4792 scope.go:117] "RemoveContainer" containerID="6daf7967b4fe0584f3736dc5540a31f35c8a54c8425775b5dee887bf53d516e3" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.931960 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-lgxjn" Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.953834 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerStarted","Data":"73746715c3f0afa1a30688cbdb37535ae431613983e623747b7d8f36c8eef984"} Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.983728 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerStarted","Data":"e99c5656915b0bc8fad1f3cf92681aabbc27bb4893a1b5fb8674bd08fc4ca146"} Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.988798 4792 generic.go:334] "Generic (PLEG): container finished" podID="5378b261-2167-4eae-a672-0dd816a99a18" containerID="613e7bdff76f4f99cbb1c0643480e2a79b2d05f5cca546408dc3b877c1eda530" exitCode=0 Mar 18 15:58:50 crc kubenswrapper[4792]: I0318 15:58:50.988844 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" event={"ID":"5378b261-2167-4eae-a672-0dd816a99a18","Type":"ContainerDied","Data":"613e7bdff76f4f99cbb1c0643480e2a79b2d05f5cca546408dc3b877c1eda530"} Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.107603 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.148031 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774db89647-lgxjn"] Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.340704 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.340955 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f8d6944f-p946j" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-api" 
containerID="cri-o://5d74754be1babe43267f5a09ee47baddd1161593a188bc0c08032c6fe04290c7" gracePeriod=30 Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.341644 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f8d6944f-p946j" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" containerID="cri-o://e20d47afd4e19084054ae7e9e5d525fe420653b2bb7cd732cee5b99c236ea119" gracePeriod=30 Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.405952 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-76f8d6944f-p946j" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.202:9696/\": read tcp 10.217.0.2:39406->10.217.0.202:9696: read: connection reset by peer" Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.430641 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f66cfdb67-9fjs4"] Mar 18 15:58:51 crc kubenswrapper[4792]: E0318 15:58:51.431351 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" containerName="init" Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.431402 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" containerName="init" Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.431901 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" containerName="init" Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.433318 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:51 crc kubenswrapper[4792]: I0318 15:58:51.490045 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f66cfdb67-9fjs4"] Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.580787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.580848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-ovndb-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.580897 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfhs\" (UniqueName: \"kubernetes.io/projected/21384247-2994-41b5-9e8e-10f0e31e5ea9-kube-api-access-qlfhs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.581099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-public-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.581362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-internal-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.581463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-httpd-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.581563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-combined-ca-bundle\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-httpd-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684232 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-combined-ca-bundle\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-ovndb-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684438 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfhs\" (UniqueName: \"kubernetes.io/projected/21384247-2994-41b5-9e8e-10f0e31e5ea9-kube-api-access-qlfhs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684487 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-public-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.684555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-internal-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.690603 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-internal-tls-certs\") pod 
\"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.690615 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-httpd-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.693667 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-combined-ca-bundle\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.693812 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-config\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.695555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-ovndb-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.702078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21384247-2994-41b5-9e8e-10f0e31e5ea9-public-tls-certs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc 
kubenswrapper[4792]: I0318 15:58:51.709316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfhs\" (UniqueName: \"kubernetes.io/projected/21384247-2994-41b5-9e8e-10f0e31e5ea9-kube-api-access-qlfhs\") pod \"neutron-5f66cfdb67-9fjs4\" (UID: \"21384247-2994-41b5-9e8e-10f0e31e5ea9\") " pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.768352 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:51.872845 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeaa1087-1d8f-4041-a5c6-506f3f2a14fb" path="/var/lib/kubelet/pods/aeaa1087-1d8f-4041-a5c6-506f3f2a14fb/volumes" Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:52.026216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerStarted","Data":"aa7293a447c5e6f54bd0d92b6ce5fa0240e9fd7b3a244deab0151bf4a079ce0f"} Mar 18 15:58:52 crc kubenswrapper[4792]: I0318 15:58:52.028185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerStarted","Data":"aacbc3965f3a5c66280df102b47aa8f228e407ff5f203805656b555aed7e6b31"} Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.063946 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerStarted","Data":"9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e"} Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.064072 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api-log" 
containerID="cri-o://e99c5656915b0bc8fad1f3cf92681aabbc27bb4893a1b5fb8674bd08fc4ca146" gracePeriod=30 Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.064347 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.064379 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" containerID="cri-o://9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e" gracePeriod=30 Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.073232 4792 generic.go:334] "Generic (PLEG): container finished" podID="84428496-0cfa-4794-baaa-65b6350ec310" containerID="e20d47afd4e19084054ae7e9e5d525fe420653b2bb7cd732cee5b99c236ea119" exitCode=0 Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.073996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerDied","Data":"e20d47afd4e19084054ae7e9e5d525fe420653b2bb7cd732cee5b99c236ea119"} Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.074118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.074252 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.100760 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.100733241 podStartE2EDuration="7.100733241s" podCreationTimestamp="2026-03-18 15:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:53.086197331 +0000 UTC m=+1481.955526268" 
watchObservedRunningTime="2026-03-18 15:58:53.100733241 +0000 UTC m=+1481.970062178" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.151695 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85d596b878-mgtbk" podStartSLOduration=6.151676713 podStartE2EDuration="6.151676713s" podCreationTimestamp="2026-03-18 15:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:53.112721531 +0000 UTC m=+1481.982050468" watchObservedRunningTime="2026-03-18 15:58:53.151676713 +0000 UTC m=+1482.021005650" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.211043 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-76f8d6944f-p946j" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.202:9696/\": dial tcp 10.217.0.202:9696: connect: connection refused" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.767223 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.767565 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:58:53 crc kubenswrapper[4792]: I0318 15:58:53.773748 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.053820 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.054224 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.170010 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerID="e99c5656915b0bc8fad1f3cf92681aabbc27bb4893a1b5fb8674bd08fc4ca146" exitCode=143 Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.171087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerDied","Data":"e99c5656915b0bc8fad1f3cf92681aabbc27bb4893a1b5fb8674bd08fc4ca146"} Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.350102 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.353914 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85d9bfc98-xffcv"] Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.357645 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.362649 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.365027 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85d9bfc98-xffcv"] Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.399490 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.408605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8a710c-9e71-411c-b036-b4f01dc4d420-logs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.408739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-public-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.408775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data-custom\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.408875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-internal-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.408939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvcm\" (UniqueName: \"kubernetes.io/projected/7d8a710c-9e71-411c-b036-b4f01dc4d420-kube-api-access-whvcm\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.409035 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-combined-ca-bundle\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.409069 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.511277 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-public-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513028 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data-custom\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-internal-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvcm\" (UniqueName: \"kubernetes.io/projected/7d8a710c-9e71-411c-b036-b4f01dc4d420-kube-api-access-whvcm\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513257 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-combined-ca-bundle\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513291 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8a710c-9e71-411c-b036-b4f01dc4d420-logs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.513875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8a710c-9e71-411c-b036-b4f01dc4d420-logs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.532292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data-custom\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.537420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-combined-ca-bundle\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.538635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-public-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.539930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-config-data\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.541005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8a710c-9e71-411c-b036-b4f01dc4d420-internal-tls-certs\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.638792 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvcm\" (UniqueName: \"kubernetes.io/projected/7d8a710c-9e71-411c-b036-b4f01dc4d420-kube-api-access-whvcm\") pod \"barbican-api-85d9bfc98-xffcv\" (UID: \"7d8a710c-9e71-411c-b036-b4f01dc4d420\") " pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.719812 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:54 crc kubenswrapper[4792]: I0318 15:58:54.775030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f66cfdb67-9fjs4"] Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.193274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" event={"ID":"442ae180-60ac-4d2c-92eb-b9a823ba74a9","Type":"ContainerStarted","Data":"917e885e170685468f0f0b46017617672cdd4e95d7f397377cf1e7b27c54276b"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.193833 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" event={"ID":"442ae180-60ac-4d2c-92eb-b9a823ba74a9","Type":"ContainerStarted","Data":"f69f3ce845c65345151fae645fa613f4e79455d85eb6ee6fd1fa448422304b1c"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.198024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" event={"ID":"5378b261-2167-4eae-a672-0dd816a99a18","Type":"ContainerStarted","Data":"a795f848d128a13d931b826d0d66939ba57886ba1e016d95571a680f31e40671"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.198558 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.206946 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f66cfdb67-9fjs4" event={"ID":"21384247-2994-41b5-9e8e-10f0e31e5ea9","Type":"ContainerStarted","Data":"715adc7267f3b32fc37512b8ccc73c488d6a66b7065a5dc4956a9281fad252ef"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.227217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" 
event={"ID":"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292","Type":"ContainerStarted","Data":"7cb71fd7a359a1489f55fa786b5374da5359396741c677a08f7eb3bae3e340a4"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.228917 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7bf8bcc748-g9kq5" podStartSLOduration=4.150803327 podStartE2EDuration="9.228905646s" podCreationTimestamp="2026-03-18 15:58:46 +0000 UTC" firstStartedPulling="2026-03-18 15:58:48.570029541 +0000 UTC m=+1477.439358478" lastFinishedPulling="2026-03-18 15:58:53.64813186 +0000 UTC m=+1482.517460797" observedRunningTime="2026-03-18 15:58:55.224958611 +0000 UTC m=+1484.094287548" watchObservedRunningTime="2026-03-18 15:58:55.228905646 +0000 UTC m=+1484.098234583" Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.237015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerStarted","Data":"5a0655d6c84c5b0595b5757f14186503d77a1fc851d7b3a4bf67dbb6022592bf"} Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.298989 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.340624671 podStartE2EDuration="9.298946472s" podCreationTimestamp="2026-03-18 15:58:46 +0000 UTC" firstStartedPulling="2026-03-18 15:58:48.133129628 +0000 UTC m=+1477.002458565" lastFinishedPulling="2026-03-18 15:58:49.091451429 +0000 UTC m=+1477.960780366" observedRunningTime="2026-03-18 15:58:55.288820071 +0000 UTC m=+1484.158148998" watchObservedRunningTime="2026-03-18 15:58:55.298946472 +0000 UTC m=+1484.168275409" Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.301164 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" podStartSLOduration=8.301153232 podStartE2EDuration="8.301153232s" podCreationTimestamp="2026-03-18 
15:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:55.264414559 +0000 UTC m=+1484.133743516" watchObservedRunningTime="2026-03-18 15:58:55.301153232 +0000 UTC m=+1484.170482169" Mar 18 15:58:55 crc kubenswrapper[4792]: I0318 15:58:55.526421 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85d9bfc98-xffcv"] Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.249713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" event={"ID":"fbcb88bd-9c5a-4e8f-bbe2-0109d7751292","Type":"ContainerStarted","Data":"03bb48b61346d48a15e0b2995f94dd7fc1b67f2b39e5a5e46568075237f88e31"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.251341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d9bfc98-xffcv" event={"ID":"7d8a710c-9e71-411c-b036-b4f01dc4d420","Type":"ContainerStarted","Data":"84f8ec337f69f2512a845e191921e2a7b8b0a6907c5bbd7151ce449055ff1928"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.251378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d9bfc98-xffcv" event={"ID":"7d8a710c-9e71-411c-b036-b4f01dc4d420","Type":"ContainerStarted","Data":"92eab03ce9ff7d86e6e9d66db2c389143b2901bb67d80423c323b9b5f0cbe514"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.251399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d9bfc98-xffcv" event={"ID":"7d8a710c-9e71-411c-b036-b4f01dc4d420","Type":"ContainerStarted","Data":"04cde56ab5383f51c36355003040e8b093b5124dbae9ef9fdb9803216e493c38"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.251438 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.252656 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-5f66cfdb67-9fjs4" event={"ID":"21384247-2994-41b5-9e8e-10f0e31e5ea9","Type":"ContainerStarted","Data":"f9d428e747bd0ab908175a97fe582a2a35a037b4c81fcc7ff9bad3188281f153"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.252692 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f66cfdb67-9fjs4" event={"ID":"21384247-2994-41b5-9e8e-10f0e31e5ea9","Type":"ContainerStarted","Data":"e6cb123e72d482fdafd2f261740a70d204a87f30f60fc5a4559b265379bcb2a3"} Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.289052 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d8b96c4f5-87qzw" podStartSLOduration=5.5019263259999995 podStartE2EDuration="10.289028698s" podCreationTimestamp="2026-03-18 15:58:46 +0000 UTC" firstStartedPulling="2026-03-18 15:58:48.869042842 +0000 UTC m=+1477.738371779" lastFinishedPulling="2026-03-18 15:58:53.656145214 +0000 UTC m=+1482.525474151" observedRunningTime="2026-03-18 15:58:56.272272997 +0000 UTC m=+1485.141601944" watchObservedRunningTime="2026-03-18 15:58:56.289028698 +0000 UTC m=+1485.158357645" Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.300445 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85d9bfc98-xffcv" podStartSLOduration=2.300420108 podStartE2EDuration="2.300420108s" podCreationTimestamp="2026-03-18 15:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:56.289607066 +0000 UTC m=+1485.158936013" watchObservedRunningTime="2026-03-18 15:58:56.300420108 +0000 UTC m=+1485.169749045" Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.323568 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f66cfdb67-9fjs4" podStartSLOduration=5.32354535 podStartE2EDuration="5.32354535s" 
podCreationTimestamp="2026-03-18 15:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:58:56.309953689 +0000 UTC m=+1485.179282626" watchObservedRunningTime="2026-03-18 15:58:56.32354535 +0000 UTC m=+1485.192874287" Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.946106 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 15:58:56 crc kubenswrapper[4792]: I0318 15:58:56.947778 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.208:8080/\": dial tcp 10.217.0.208:8080: connect: connection refused" Mar 18 15:58:57 crc kubenswrapper[4792]: I0318 15:58:57.267745 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:58:57 crc kubenswrapper[4792]: I0318 15:58:57.268153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:58:59 crc kubenswrapper[4792]: I0318 15:58:59.460441 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-85d596b878-mgtbk" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 15:58:59 crc kubenswrapper[4792]: I0318 15:58:59.848892 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:58:59 crc kubenswrapper[4792]: I0318 15:58:59.897591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:59:00 crc kubenswrapper[4792]: I0318 15:59:00.322171 4792 patch_prober.go:28] interesting 
pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:59:00 crc kubenswrapper[4792]: I0318 15:59:00.322595 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:59:00 crc kubenswrapper[4792]: I0318 15:59:00.593461 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 15:59:02 crc kubenswrapper[4792]: I0318 15:59:02.166121 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 15:59:02 crc kubenswrapper[4792]: I0318 15:59:02.229533 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:02 crc kubenswrapper[4792]: I0318 15:59:02.318370 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="cinder-scheduler" containerID="cri-o://aacbc3965f3a5c66280df102b47aa8f228e407ff5f203805656b555aed7e6b31" gracePeriod=30 Mar 18 15:59:02 crc kubenswrapper[4792]: I0318 15:59:02.318439 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="probe" containerID="cri-o://5a0655d6c84c5b0595b5757f14186503d77a1fc851d7b3a4bf67dbb6022592bf" gracePeriod=30 Mar 18 15:59:03 crc kubenswrapper[4792]: I0318 15:59:03.062144 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:59:03 crc kubenswrapper[4792]: I0318 15:59:03.268302 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:59:03 crc kubenswrapper[4792]: I0318 15:59:03.268587 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="dnsmasq-dns" containerID="cri-o://7b33c3b1f993d9af75223d7aa0adaae51d47fce2167ad3a0b643b0af28fb7733" gracePeriod=10 Mar 18 15:59:03 crc kubenswrapper[4792]: I0318 15:59:03.353782 4792 generic.go:334] "Generic (PLEG): container finished" podID="d48439a9-65ab-4389-8776-80321175e016" containerID="5a0655d6c84c5b0595b5757f14186503d77a1fc851d7b3a4bf67dbb6022592bf" exitCode=0 Mar 18 15:59:03 crc kubenswrapper[4792]: I0318 15:59:03.354154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerDied","Data":"5a0655d6c84c5b0595b5757f14186503d77a1fc851d7b3a4bf67dbb6022592bf"} Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.426850 4792 generic.go:334] "Generic (PLEG): container finished" podID="d48439a9-65ab-4389-8776-80321175e016" containerID="aacbc3965f3a5c66280df102b47aa8f228e407ff5f203805656b555aed7e6b31" exitCode=0 Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.427271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerDied","Data":"aacbc3965f3a5c66280df102b47aa8f228e407ff5f203805656b555aed7e6b31"} Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.439470 4792 generic.go:334] "Generic (PLEG): container finished" podID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerID="7b33c3b1f993d9af75223d7aa0adaae51d47fce2167ad3a0b643b0af28fb7733" exitCode=0 Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 
15:59:04.439515 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerDied","Data":"7b33c3b1f993d9af75223d7aa0adaae51d47fce2167ad3a0b643b0af28fb7733"} Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.678311 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846252 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846583 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxtqd\" (UniqueName: \"kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846833 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.846993 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb\") pod \"314657e3-e425-4547-9bc0-34b6554bb0c9\" (UID: \"314657e3-e425-4547-9bc0-34b6554bb0c9\") " Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.868583 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd" (OuterVolumeSpecName: "kube-api-access-cxtqd") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "kube-api-access-cxtqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.949930 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxtqd\" (UniqueName: \"kubernetes.io/projected/314657e3-e425-4547-9bc0-34b6554bb0c9-kube-api-access-cxtqd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:04 crc kubenswrapper[4792]: I0318 15:59:04.990673 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.019767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.045234 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.052565 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.052594 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.062558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.063480 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.108712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config" (OuterVolumeSpecName: "config") pod "314657e3-e425-4547-9bc0-34b6554bb0c9" (UID: "314657e3-e425-4547-9bc0-34b6554bb0c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.154886 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.155282 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2tnb\" (UniqueName: \"kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.155497 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc 
kubenswrapper[4792]: I0318 15:59:05.155619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.155861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.156009 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.156392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data\") pod \"d48439a9-65ab-4389-8776-80321175e016\" (UID: \"d48439a9-65ab-4389-8776-80321175e016\") " Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.157674 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.157796 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48439a9-65ab-4389-8776-80321175e016-etc-machine-id\") on node 
\"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.157994 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.158094 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314657e3-e425-4547-9bc0-34b6554bb0c9-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.162108 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb" (OuterVolumeSpecName: "kube-api-access-b2tnb") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "kube-api-access-b2tnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.162764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts" (OuterVolumeSpecName: "scripts") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.163284 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.262968 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.263325 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2tnb\" (UniqueName: \"kubernetes.io/projected/d48439a9-65ab-4389-8776-80321175e016-kube-api-access-b2tnb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.263400 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.268118 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.321182 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data" (OuterVolumeSpecName: "config-data") pod "d48439a9-65ab-4389-8776-80321175e016" (UID: "d48439a9-65ab-4389-8776-80321175e016"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.365764 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.365803 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48439a9-65ab-4389-8776-80321175e016-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.454460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" event={"ID":"314657e3-e425-4547-9bc0-34b6554bb0c9","Type":"ContainerDied","Data":"1854bed556e91c77e1512750729d15e99aaf606c82010fcca10c11ce1e775e8a"} Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.454559 4792 scope.go:117] "RemoveContainer" containerID="7b33c3b1f993d9af75223d7aa0adaae51d47fce2167ad3a0b643b0af28fb7733" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.454829 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-5lpcm" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.461597 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d48439a9-65ab-4389-8776-80321175e016","Type":"ContainerDied","Data":"e137849d4ed0dc225d28d86f5ab5ed2ddd2d2e2eb0fa486a7cc55a182409066e"} Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.461671 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.549930 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.558136 4792 scope.go:117] "RemoveContainer" containerID="6f5e342bf67017c442a9bf121defb671b61b4e3a7126f9089a7c591ad96caddd" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.592114 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.625204 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:05 crc kubenswrapper[4792]: E0318 15:59:05.626047 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="dnsmasq-dns" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626062 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="dnsmasq-dns" Mar 18 15:59:05 crc kubenswrapper[4792]: E0318 15:59:05.626078 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="cinder-scheduler" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626084 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="cinder-scheduler" Mar 18 15:59:05 crc kubenswrapper[4792]: E0318 15:59:05.626093 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="init" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626098 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="init" Mar 18 15:59:05 crc kubenswrapper[4792]: E0318 15:59:05.626127 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="probe" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626133 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="probe" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626356 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="probe" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626369 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48439a9-65ab-4389-8776-80321175e016" containerName="cinder-scheduler" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.626376 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" containerName="dnsmasq-dns" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.638310 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.650843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.651634 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.676372 4792 scope.go:117] "RemoveContainer" containerID="5a0655d6c84c5b0595b5757f14186503d77a1fc851d7b3a4bf67dbb6022592bf" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.676756 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-5lpcm"] Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.694469 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.760400 4792 scope.go:117] "RemoveContainer" 
containerID="aacbc3965f3a5c66280df102b47aa8f228e407ff5f203805656b555aed7e6b31" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797623 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797760 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " 
pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.797880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pzfc\" (UniqueName: \"kubernetes.io/projected/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-kube-api-access-4pzfc\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.867700 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314657e3-e425-4547-9bc0-34b6554bb0c9" path="/var/lib/kubelet/pods/314657e3-e425-4547-9bc0-34b6554bb0c9/volumes" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.868821 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48439a9-65ab-4389-8776-80321175e016" path="/var/lib/kubelet/pods/d48439a9-65ab-4389-8776-80321175e016/volumes" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pzfc\" (UniqueName: \"kubernetes.io/projected/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-kube-api-access-4pzfc\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.902986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.903234 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.909278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.909739 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.910437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.923082 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:05 crc kubenswrapper[4792]: I0318 15:59:05.923866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pzfc\" (UniqueName: \"kubernetes.io/projected/d41b9217-24bd-4b7c-98f7-04ec8ca9bf89-kube-api-access-4pzfc\") pod \"cinder-scheduler-0\" (UID: \"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89\") " pod="openstack/cinder-scheduler-0" Mar 18 15:59:06 crc kubenswrapper[4792]: I0318 15:59:06.050188 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 15:59:06 crc kubenswrapper[4792]: I0318 15:59:06.793904 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 15:59:06 crc kubenswrapper[4792]: I0318 15:59:06.953941 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:59:06 crc kubenswrapper[4792]: I0318 15:59:06.963745 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.435715 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.439179 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.449896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.554681 4792 generic.go:334] "Generic (PLEG): container finished" podID="84428496-0cfa-4794-baaa-65b6350ec310" containerID="5d74754be1babe43267f5a09ee47baddd1161593a188bc0c08032c6fe04290c7" exitCode=0 Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.555163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerDied","Data":"5d74754be1babe43267f5a09ee47baddd1161593a188bc0c08032c6fe04290c7"} Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.565712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities\") pod \"redhat-operators-8hrjt\" (UID: 
\"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.565961 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrp4\" (UniqueName: \"kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.566039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.568986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89","Type":"ContainerStarted","Data":"f74aa8fdbc8540e4c32b972159612f6fb06766c1f72c41cdafb5f96107ce1d47"} Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.671586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.671726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " 
pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.672029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrp4\" (UniqueName: \"kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.672815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.673257 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.696426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrp4\" (UniqueName: \"kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4\") pod \"redhat-operators-8hrjt\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:07 crc kubenswrapper[4792]: I0318 15:59:07.811578 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.263719 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.391661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.391833 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.392007 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.392075 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-922sn\" (UniqueName: \"kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.392120 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.392156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.392207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs\") pod \"84428496-0cfa-4794-baaa-65b6350ec310\" (UID: \"84428496-0cfa-4794-baaa-65b6350ec310\") " Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.401108 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn" (OuterVolumeSpecName: "kube-api-access-922sn") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "kube-api-access-922sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.405321 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.499182 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-922sn\" (UniqueName: \"kubernetes.io/projected/84428496-0cfa-4794-baaa-65b6350ec310-kube-api-access-922sn\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.499227 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.507335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.575310 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.585965 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.603101 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.659094 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: 
"84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.687757 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8d6944f-p946j" event={"ID":"84428496-0cfa-4794-baaa-65b6350ec310","Type":"ContainerDied","Data":"dc2e566436b429754d43058be1569b413074b3231c8df5633ed98c8a4f08c19e"} Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.687818 4792 scope.go:117] "RemoveContainer" containerID="e20d47afd4e19084054ae7e9e5d525fe420653b2bb7cd732cee5b99c236ea119" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.688041 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f8d6944f-p946j" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.694903 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89","Type":"ContainerStarted","Data":"341949ae4e0b6b9ca7fd9a86f851b16cae109de816ab7917c8bc28440310f62a"} Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.703229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.704787 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.704808 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.720732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config" (OuterVolumeSpecName: "config") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.722403 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "84428496-0cfa-4794-baaa-65b6350ec310" (UID: "84428496-0cfa-4794-baaa-65b6350ec310"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.814530 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.814590 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/84428496-0cfa-4794-baaa-65b6350ec310-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.827433 4792 scope.go:117] "RemoveContainer" containerID="5d74754be1babe43267f5a09ee47baddd1161593a188bc0c08032c6fe04290c7" Mar 18 15:59:08 crc kubenswrapper[4792]: I0318 15:59:08.920934 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-79db488c56-wbmrk" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.108368 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-746b5459b4-c8cg4"] Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.108841 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-746b5459b4-c8cg4" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-log" containerID="cri-o://1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2" gracePeriod=30 Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.108962 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-746b5459b4-c8cg4" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-api" containerID="cri-o://8a2a66e0e6d953baf012fd12252c65d6f4230c2b9665d95e82df84d33bf48d79" gracePeriod=30 Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.207994 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:59:09 crc 
kubenswrapper[4792]: I0318 15:59:09.289405 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76f8d6944f-p946j"] Mar 18 15:59:09 crc kubenswrapper[4792]: E0318 15:59:09.292060 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84428496_0cfa_4794_baaa_65b6350ec310.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82382aff_7beb_4e33_8a05_5f58f2d3b299.slice/crio-244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22f1eb2_0bf5_463c_859d_d7dd50b11a70.slice/crio-conmon-1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2.scope\": RecentStats: unable to find data in memory cache]" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.734182 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85d9bfc98-xffcv" podUID="7d8a710c-9e71-411c-b036-b4f01dc4d420" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.216:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.734201 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85d9bfc98-xffcv" podUID="7d8a710c-9e71-411c-b036-b4f01dc4d420" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.216:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.746912 4792 generic.go:334] "Generic (PLEG): container finished" podID="82382aff-7beb-4e33-8a05-5f58f2d3b299" 
containerID="244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24" exitCode=0 Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.747050 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerDied","Data":"244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24"} Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.747077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerStarted","Data":"77d8afd4ba7670ab3c680cc27f3ba2449735ae906231a897f2ae7a9975c4534f"} Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.800297 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89","Type":"ContainerStarted","Data":"aa391d1ee665d6141c1c45fcd05bd30d3f87a3c0a4ff14753da552c9ee24402c"} Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.834136 4792 generic.go:334] "Generic (PLEG): container finished" podID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerID="1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2" exitCode=143 Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.835408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerDied","Data":"1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2"} Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.878329 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.878305597 podStartE2EDuration="4.878305597s" podCreationTimestamp="2026-03-18 15:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 15:59:09.875469137 +0000 UTC m=+1498.744798084" watchObservedRunningTime="2026-03-18 15:59:09.878305597 +0000 UTC m=+1498.747634534" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.947719 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84428496-0cfa-4794-baaa-65b6350ec310" path="/var/lib/kubelet/pods/84428496-0cfa-4794-baaa-65b6350ec310/volumes" Mar 18 15:59:09 crc kubenswrapper[4792]: I0318 15:59:09.953177 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:59:10 crc kubenswrapper[4792]: I0318 15:59:10.459240 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-69977cc675-l62x5" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.051307 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.083895 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 15:59:11 crc kubenswrapper[4792]: E0318 15:59:11.084467 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-api" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.084481 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-api" Mar 18 15:59:11 crc kubenswrapper[4792]: E0318 15:59:11.084532 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.084539 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.084797 4792 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-httpd" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.084859 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="84428496-0cfa-4794-baaa-65b6350ec310" containerName="neutron-api" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.085809 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.087903 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.088462 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jfh4x" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.089572 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.097142 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.223572 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zqq\" (UniqueName: \"kubernetes.io/projected/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-kube-api-access-k7zqq\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.223645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.223720 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.223877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.326314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.326480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zqq\" (UniqueName: \"kubernetes.io/projected/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-kube-api-access-k7zqq\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.326522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.326594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.327298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.332501 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.347348 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.355440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zqq\" (UniqueName: \"kubernetes.io/projected/2dfd536a-310d-4039-a397-2bcdcdc0c2c2-kube-api-access-k7zqq\") pod \"openstackclient\" (UID: \"2dfd536a-310d-4039-a397-2bcdcdc0c2c2\") " pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.405723 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 15:59:11 crc kubenswrapper[4792]: I0318 15:59:11.874360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerStarted","Data":"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea"} Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.018606 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 15:59:12 crc kubenswrapper[4792]: W0318 15:59:12.019331 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfd536a_310d_4039_a397_2bcdcdc0c2c2.slice/crio-47160274ffbed13c05ccf8f3f2e71b973b05e4a91e87b94596d7955a98768af6 WatchSource:0}: Error finding container 47160274ffbed13c05ccf8f3f2e71b973b05e4a91e87b94596d7955a98768af6: Status 404 returned error can't find the container with id 47160274ffbed13c05ccf8f3f2e71b973b05e4a91e87b94596d7955a98768af6 Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.527261 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.210:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.590281 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85d9bfc98-xffcv" Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.689379 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.689660 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85d596b878-mgtbk" 
podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api-log" containerID="cri-o://73746715c3f0afa1a30688cbdb37535ae431613983e623747b7d8f36c8eef984" gracePeriod=30 Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.690301 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85d596b878-mgtbk" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" containerID="cri-o://aa7293a447c5e6f54bd0d92b6ce5fa0240e9fd7b3a244deab0151bf4a079ce0f" gracePeriod=30 Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.922948 4792 generic.go:334] "Generic (PLEG): container finished" podID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerID="8a2a66e0e6d953baf012fd12252c65d6f4230c2b9665d95e82df84d33bf48d79" exitCode=0 Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.923036 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerDied","Data":"8a2a66e0e6d953baf012fd12252c65d6f4230c2b9665d95e82df84d33bf48d79"} Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.929064 4792 generic.go:334] "Generic (PLEG): container finished" podID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerID="73746715c3f0afa1a30688cbdb37535ae431613983e623747b7d8f36c8eef984" exitCode=143 Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.929166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerDied","Data":"73746715c3f0afa1a30688cbdb37535ae431613983e623747b7d8f36c8eef984"} Mar 18 15:59:12 crc kubenswrapper[4792]: I0318 15:59:12.957831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2dfd536a-310d-4039-a397-2bcdcdc0c2c2","Type":"ContainerStarted","Data":"47160274ffbed13c05ccf8f3f2e71b973b05e4a91e87b94596d7955a98768af6"} Mar 18 15:59:13 
crc kubenswrapper[4792]: I0318 15:59:13.599865 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611427 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4542g\" (UniqueName: \"kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611504 
4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.611543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle\") pod \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\" (UID: \"d22f1eb2-0bf5-463c-859d-d7dd50b11a70\") " Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.612117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs" (OuterVolumeSpecName: "logs") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.620926 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts" (OuterVolumeSpecName: "scripts") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.637250 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g" (OuterVolumeSpecName: "kube-api-access-4542g") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "kube-api-access-4542g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.696164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.715026 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.715064 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4542g\" (UniqueName: \"kubernetes.io/projected/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-kube-api-access-4542g\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.715078 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.715088 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.776253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data" (OuterVolumeSpecName: "config-data") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.812610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.814226 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d22f1eb2-0bf5-463c-859d-d7dd50b11a70" (UID: "d22f1eb2-0bf5-463c-859d-d7dd50b11a70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.819117 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.819179 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.819194 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22f1eb2-0bf5-463c-859d-d7dd50b11a70-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.970074 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-746b5459b4-c8cg4" 
event={"ID":"d22f1eb2-0bf5-463c-859d-d7dd50b11a70","Type":"ContainerDied","Data":"a81b893557361b5cc2a7a3cceb7b3e34d33bcb07d06a4567a75970dcaf113370"} Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.970124 4792 scope.go:117] "RemoveContainer" containerID="8a2a66e0e6d953baf012fd12252c65d6f4230c2b9665d95e82df84d33bf48d79" Mar 18 15:59:13 crc kubenswrapper[4792]: I0318 15:59:13.970283 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-746b5459b4-c8cg4" Mar 18 15:59:14 crc kubenswrapper[4792]: I0318 15:59:14.000916 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-746b5459b4-c8cg4"] Mar 18 15:59:14 crc kubenswrapper[4792]: I0318 15:59:14.021334 4792 scope.go:117] "RemoveContainer" containerID="1a7e86a6f62a9787b04ec1e69e1a5bc737094190167bec40078f24fb183d7bf2" Mar 18 15:59:14 crc kubenswrapper[4792]: I0318 15:59:14.023359 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-746b5459b4-c8cg4"] Mar 18 15:59:15 crc kubenswrapper[4792]: I0318 15:59:15.871690 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" path="/var/lib/kubelet/pods/d22f1eb2-0bf5-463c-859d-d7dd50b11a70/volumes" Mar 18 15:59:15 crc kubenswrapper[4792]: I0318 15:59:15.895959 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85d596b878-mgtbk" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.214:9311/healthcheck\": read tcp 10.217.0.2:59378->10.217.0.214:9311: read: connection reset by peer" Mar 18 15:59:15 crc kubenswrapper[4792]: I0318 15:59:15.896213 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-85d596b878-mgtbk" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.214:9311/healthcheck\": read tcp 
10.217.0.2:59390->10.217.0.214:9311: read: connection reset by peer" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.004773 4792 generic.go:334] "Generic (PLEG): container finished" podID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerID="aa7293a447c5e6f54bd0d92b6ce5fa0240e9fd7b3a244deab0151bf4a079ce0f" exitCode=0 Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.004873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerDied","Data":"aa7293a447c5e6f54bd0d92b6ce5fa0240e9fd7b3a244deab0151bf4a079ce0f"} Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.012058 4792 generic.go:334] "Generic (PLEG): container finished" podID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerID="7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea" exitCode=0 Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.012102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerDied","Data":"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea"} Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.464425 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.538644 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.695409 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom\") pod \"ac4b1400-97cc-426c-9364-8d03b7c43037\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.695484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfhjp\" (UniqueName: \"kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp\") pod \"ac4b1400-97cc-426c-9364-8d03b7c43037\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.695513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data\") pod \"ac4b1400-97cc-426c-9364-8d03b7c43037\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.695573 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle\") pod \"ac4b1400-97cc-426c-9364-8d03b7c43037\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.695637 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs\") pod \"ac4b1400-97cc-426c-9364-8d03b7c43037\" (UID: \"ac4b1400-97cc-426c-9364-8d03b7c43037\") " Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.698265 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs" (OuterVolumeSpecName: "logs") pod "ac4b1400-97cc-426c-9364-8d03b7c43037" (UID: "ac4b1400-97cc-426c-9364-8d03b7c43037"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.709168 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp" (OuterVolumeSpecName: "kube-api-access-pfhjp") pod "ac4b1400-97cc-426c-9364-8d03b7c43037" (UID: "ac4b1400-97cc-426c-9364-8d03b7c43037"). InnerVolumeSpecName "kube-api-access-pfhjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.728856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac4b1400-97cc-426c-9364-8d03b7c43037" (UID: "ac4b1400-97cc-426c-9364-8d03b7c43037"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.755255 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac4b1400-97cc-426c-9364-8d03b7c43037" (UID: "ac4b1400-97cc-426c-9364-8d03b7c43037"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.798850 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.798891 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfhjp\" (UniqueName: \"kubernetes.io/projected/ac4b1400-97cc-426c-9364-8d03b7c43037-kube-api-access-pfhjp\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.798905 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.798916 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4b1400-97cc-426c-9364-8d03b7c43037-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.829905 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data" (OuterVolumeSpecName: "config-data") pod "ac4b1400-97cc-426c-9364-8d03b7c43037" (UID: "ac4b1400-97cc-426c-9364-8d03b7c43037"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.900741 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4b1400-97cc-426c-9364-8d03b7c43037-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:16 crc kubenswrapper[4792]: I0318 15:59:16.908225 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.001665 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csr85\" (UniqueName: \"kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.001723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.001862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.001893 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.001938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.002046 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.002104 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts\") pod \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\" (UID: \"bd1c490b-f876-4095-9ae8-8280d3ce02c9\") " Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.002191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.003329 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.006076 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.012841 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts" (OuterVolumeSpecName: "scripts") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.017919 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85" (OuterVolumeSpecName: "kube-api-access-csr85") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "kube-api-access-csr85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.088843 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerID="12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96" exitCode=137 Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.088933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerDied","Data":"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96"} Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.088959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd1c490b-f876-4095-9ae8-8280d3ce02c9","Type":"ContainerDied","Data":"2bbf0f2ed1e9202e0a37c40e5ad513d0305fabbcc3f333d50796855e3fa1439b"} Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.088992 4792 scope.go:117] "RemoveContainer" containerID="12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.089140 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.112942 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.113004 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csr85\" (UniqueName: \"kubernetes.io/projected/bd1c490b-f876-4095-9ae8-8280d3ce02c9-kube-api-access-csr85\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.113017 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd1c490b-f876-4095-9ae8-8280d3ce02c9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.142114 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.148155 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85d596b878-mgtbk" event={"ID":"ac4b1400-97cc-426c-9364-8d03b7c43037","Type":"ContainerDied","Data":"175712cb16ead78d1ecb2dfd2fadfc99a5d4f259d91207cbe6fa6dea0379a5c2"} Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.148260 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85d596b878-mgtbk" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.185825 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.216850 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.216890 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.246127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data" (OuterVolumeSpecName: "config-data") pod "bd1c490b-f876-4095-9ae8-8280d3ce02c9" (UID: "bd1c490b-f876-4095-9ae8-8280d3ce02c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.323124 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1c490b-f876-4095-9ae8-8280d3ce02c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.351095 4792 scope.go:117] "RemoveContainer" containerID="2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.364069 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.378225 4792 scope.go:117] "RemoveContainer" containerID="1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.382654 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-85d596b878-mgtbk"] Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.429443 4792 scope.go:117] "RemoveContainer" containerID="12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.430898 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96\": container with ID starting with 12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96 not found: ID does not exist" containerID="12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.430939 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96"} err="failed to get container status \"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96\": rpc error: code = 
NotFound desc = could not find container \"12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96\": container with ID starting with 12ab24f82f23609f4f77bc13d10694efd071d9ebe016fb595e1061d116f7de96 not found: ID does not exist" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.430987 4792 scope.go:117] "RemoveContainer" containerID="2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.431624 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12\": container with ID starting with 2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12 not found: ID does not exist" containerID="2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.431663 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12"} err="failed to get container status \"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12\": rpc error: code = NotFound desc = could not find container \"2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12\": container with ID starting with 2a954ce27d2ba612ffffabd81a84abaaf3051a9d7b8a50be2a7bde96b40e4a12 not found: ID does not exist" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.431697 4792 scope.go:117] "RemoveContainer" containerID="1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.432053 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598\": container with ID starting with 
1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598 not found: ID does not exist" containerID="1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.432075 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598"} err="failed to get container status \"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598\": rpc error: code = NotFound desc = could not find container \"1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598\": container with ID starting with 1e7aa1839955d6a7425dead48965b9ad7ac0a67d1cfc43fbd98a8b68cac37598 not found: ID does not exist" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.432090 4792 scope.go:117] "RemoveContainer" containerID="aa7293a447c5e6f54bd0d92b6ce5fa0240e9fd7b3a244deab0151bf4a079ce0f" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.476166 4792 scope.go:117] "RemoveContainer" containerID="73746715c3f0afa1a30688cbdb37535ae431613983e623747b7d8f36c8eef984" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.477179 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.490433 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.568495 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.571222 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-api" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571264 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-api" Mar 18 15:59:17 crc kubenswrapper[4792]: 
E0318 15:59:17.571328 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571339 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.571365 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api-log" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571373 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api-log" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.571409 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="proxy-httpd" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571418 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="proxy-httpd" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.571442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="ceilometer-notification-agent" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571451 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="ceilometer-notification-agent" Mar 18 15:59:17 crc kubenswrapper[4792]: E0318 15:59:17.571466 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-log" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571476 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-log" Mar 18 15:59:17 crc kubenswrapper[4792]: 
E0318 15:59:17.571493 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="sg-core" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.571500 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="sg-core" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573421 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="ceilometer-notification-agent" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573451 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="sg-core" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573470 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573491 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-api" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573522 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" containerName="proxy-httpd" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573541 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" containerName="barbican-api-log" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.573585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22f1eb2-0bf5-463c-859d-d7dd50b11a70" containerName="placement-log" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.597438 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.603095 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.624567 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.650730 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.782645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.782792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnt4v\" (UniqueName: \"kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.782848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.782901 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.782958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.783022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.783046 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.871908 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4b1400-97cc-426c-9364-8d03b7c43037" path="/var/lib/kubelet/pods/ac4b1400-97cc-426c-9364-8d03b7c43037/volumes" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.872724 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1c490b-f876-4095-9ae8-8280d3ce02c9" path="/var/lib/kubelet/pods/bd1c490b-f876-4095-9ae8-8280d3ce02c9/volumes" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.884965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnt4v\" (UniqueName: \"kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v\") pod \"ceilometer-0\" (UID: 
\"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.885675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.887652 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.887995 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.891592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.891796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.892894 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.893836 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.907569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnt4v\" (UniqueName: \"kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v\") pod \"ceilometer-0\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") " pod="openstack/ceilometer-0" Mar 18 15:59:17 crc kubenswrapper[4792]: I0318 15:59:17.956284 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 15:59:18 crc kubenswrapper[4792]: I0318 15:59:18.177005 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerStarted","Data":"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2"} Mar 18 15:59:18 crc kubenswrapper[4792]: I0318 15:59:18.205644 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hrjt" podStartSLOduration=4.136922529 podStartE2EDuration="11.205622869s" podCreationTimestamp="2026-03-18 15:59:07 +0000 UTC" firstStartedPulling="2026-03-18 15:59:09.752070893 +0000 UTC m=+1498.621399840" lastFinishedPulling="2026-03-18 15:59:16.820771243 +0000 UTC m=+1505.690100180" observedRunningTime="2026-03-18 15:59:18.201643094 +0000 UTC m=+1507.070972031" watchObservedRunningTime="2026-03-18 15:59:18.205622869 +0000 UTC m=+1507.074951806" Mar 18 15:59:18 crc kubenswrapper[4792]: I0318 15:59:18.749751 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.107370 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.645780 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7984d74779-bxqk7"] Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.648929 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.654423 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.654684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.654903 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.727732 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7984d74779-bxqk7"] Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843725 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-config-data\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-log-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843790 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-run-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843841 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khgjb\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-kube-api-access-khgjb\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-combined-ca-bundle\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-etc-swift\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.843963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-internal-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.844041 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-public-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-config-data\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-log-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-run-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khgjb\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-kube-api-access-khgjb\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946303 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-combined-ca-bundle\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-etc-swift\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-internal-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.946490 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-public-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.952841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-public-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.956621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-config-data\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.956965 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-log-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.957255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-run-httpd\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.962497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-combined-ca-bundle\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.969794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-internal-tls-certs\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:19 crc kubenswrapper[4792]: I0318 15:59:19.970637 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-etc-swift\") pod \"swift-proxy-7984d74779-bxqk7\" 
(UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:20 crc kubenswrapper[4792]: I0318 15:59:20.000135 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khgjb\" (UniqueName: \"kubernetes.io/projected/e19c85aa-c5a1-4d0d-99ff-cc9283e5252f-kube-api-access-khgjb\") pod \"swift-proxy-7984d74779-bxqk7\" (UID: \"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f\") " pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:20 crc kubenswrapper[4792]: I0318 15:59:20.005864 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:21 crc kubenswrapper[4792]: I0318 15:59:21.806363 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f66cfdb67-9fjs4" Mar 18 15:59:21 crc kubenswrapper[4792]: I0318 15:59:21.918794 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:59:21 crc kubenswrapper[4792]: I0318 15:59:21.919289 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b66ddf48-4mkl2" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-api" containerID="cri-o://a3fc31e388b6c2a17bc5a418f4aee15bc39270924b3f2af99fdb44f8479a617e" gracePeriod=30 Mar 18 15:59:21 crc kubenswrapper[4792]: I0318 15:59:21.919536 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56b66ddf48-4mkl2" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-httpd" containerID="cri-o://e83dd235cb7655cc32dc1b455f73a87297d7a65837faf67dec7c6873f58184f1" gracePeriod=30 Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.456902 4792 generic.go:334] "Generic (PLEG): container finished" podID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerID="e83dd235cb7655cc32dc1b455f73a87297d7a65837faf67dec7c6873f58184f1" exitCode=0 Mar 18 15:59:22 crc 
kubenswrapper[4792]: I0318 15:59:22.456959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerDied","Data":"e83dd235cb7655cc32dc1b455f73a87297d7a65837faf67dec7c6873f58184f1"} Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.842268 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7llzh"] Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.844078 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.857624 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7llzh"] Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.932076 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-76vww"] Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.933593 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.958499 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76vww"] Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.976570 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:22 crc kubenswrapper[4792]: I0318 15:59:22.976690 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4cj4\" (UniqueName: \"kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.001141 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3c34-account-create-update-r5zbr"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.004351 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.008186 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.021001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3c34-account-create-update-r5zbr"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.082937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg756\" (UniqueName: \"kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756\") pod \"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.084309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.084520 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4cj4\" (UniqueName: \"kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.084580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts\") pod \"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " 
pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.085460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.138167 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4cj4\" (UniqueName: \"kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4\") pod \"nova-api-db-create-7llzh\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.161264 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mgrlp"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.163175 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.187424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg756\" (UniqueName: \"kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756\") pod \"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.188052 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hq99\" (UniqueName: \"kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.188212 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts\") pod \"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.188434 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.189099 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts\") pod 
\"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.189313 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.198476 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mgrlp"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.238742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg756\" (UniqueName: \"kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756\") pod \"nova-cell0-db-create-76vww\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.240917 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e36b-account-create-update-mkc5t"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.242622 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.260449 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.266161 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.274480 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e36b-account-create-update-mkc5t"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.294126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hq99\" (UniqueName: \"kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.294329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpcbf\" (UniqueName: \"kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf\") pod \"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.294407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.294467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts\") pod \"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.295093 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.320592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hq99\" (UniqueName: \"kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99\") pod \"nova-api-3c34-account-create-update-r5zbr\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.337376 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.340883 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88f3-account-create-update-2s8l9"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.342571 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.345878 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.374870 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88f3-account-create-update-2s8l9"] Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.396305 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtl8\" (UniqueName: \"kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.396368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts\") pod \"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.396502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.396666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpcbf\" (UniqueName: \"kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf\") pod 
\"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.397786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts\") pod \"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.418223 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpcbf\" (UniqueName: \"kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf\") pod \"nova-cell1-db-create-mgrlp\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.488271 4792 generic.go:334] "Generic (PLEG): container finished" podID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerID="9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e" exitCode=137 Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.488322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerDied","Data":"9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e"} Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.499491 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.498624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.499651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.499961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqtl8\" (UniqueName: \"kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.500141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snf8t\" (UniqueName: \"kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.505377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.533206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtl8\" (UniqueName: \"kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8\") pod \"nova-cell0-e36b-account-create-update-mkc5t\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.604453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snf8t\" (UniqueName: \"kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.604583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.605387 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.638666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snf8t\" (UniqueName: 
\"kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t\") pod \"nova-cell1-88f3-account-create-update-2s8l9\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.708792 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:23 crc kubenswrapper[4792]: I0318 15:59:23.793284 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.118669 4792 scope.go:117] "RemoveContainer" containerID="1766b4dff7fc2fd9360828e962dc090b101b583f2a29d299479f530191326bd7" Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.481498 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.210:8776/healthcheck\": dial tcp 10.217.0.210:8776: connect: connection refused" Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.540641 4792 generic.go:334] "Generic (PLEG): container finished" podID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerID="a3fc31e388b6c2a17bc5a418f4aee15bc39270924b3f2af99fdb44f8479a617e" exitCode=0 Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.540690 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerDied","Data":"a3fc31e388b6c2a17bc5a418f4aee15bc39270924b3f2af99fdb44f8479a617e"} Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.812386 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.812436 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:27 crc kubenswrapper[4792]: I0318 15:59:27.874741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:28 crc kubenswrapper[4792]: I0318 15:59:28.589041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerStarted","Data":"3ea2d0ed1ed0390cafeeee133e82a1ca5464f1f496941b27bb3125cecfe3c0a3"} Mar 18 15:59:28 crc kubenswrapper[4792]: I0318 15:59:28.707341 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:28 crc kubenswrapper[4792]: I0318 15:59:28.804361 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:28 crc kubenswrapper[4792]: I0318 15:59:28.980884 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.021843 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.053316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.053485 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.053752 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.053856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.053900 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlg6g\" (UniqueName: \"kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.054105 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.054158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs" (OuterVolumeSpecName: "logs") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.054143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id\") pod \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\" (UID: \"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.054854 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-logs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.054917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.065930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts" (OuterVolumeSpecName: "scripts") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.074609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.076163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g" (OuterVolumeSpecName: "kube-api-access-dlg6g") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "kube-api-access-dlg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.089897 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76vww"] Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.129013 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.155954 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle\") pod \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.156462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7h6p\" (UniqueName: \"kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p\") pod \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.156550 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config\") pod \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.156710 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs\") pod \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.156734 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config\") pod \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\" (UID: \"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5\") " Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.160485 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlg6g\" (UniqueName: 
\"kubernetes.io/projected/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-kube-api-access-dlg6g\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.160525 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.160536 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.160548 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.160559 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.166739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p" (OuterVolumeSpecName: "kube-api-access-p7h6p") pod "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" (UID: "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5"). InnerVolumeSpecName "kube-api-access-p7h6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.175292 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" (UID: "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.191398 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data" (OuterVolumeSpecName: "config-data") pod "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" (UID: "4519ccde-5d1c-49b1-9f7b-4af0d2f2b557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.264645 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7h6p\" (UniqueName: \"kubernetes.io/projected/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-kube-api-access-p7h6p\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.264676 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.264688 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.279394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" (UID: "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.282106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config" (OuterVolumeSpecName: "config") pod "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" (UID: "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.332164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" (UID: "49d8ca8f-4d78-40ba-b62c-7329c59cb7c5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.370544 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.370577 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.370587 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.581912 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e36b-account-create-update-mkc5t"] Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.750279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"2dfd536a-310d-4039-a397-2bcdcdc0c2c2","Type":"ContainerStarted","Data":"784975d247e44b158cdf3eec84a4eb28c92ce4c6bedf4281cf4e12185797fcff"} Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.846058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4519ccde-5d1c-49b1-9f7b-4af0d2f2b557","Type":"ContainerDied","Data":"24c3e451888c9aae9218fe9a33b60d7d7854a843c7b629eeafa652da78a83921"} Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.846117 4792 scope.go:117] "RemoveContainer" containerID="9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.846361 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.858021 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.283648529 podStartE2EDuration="18.857997655s" podCreationTimestamp="2026-03-18 15:59:11 +0000 UTC" firstStartedPulling="2026-03-18 15:59:12.022344033 +0000 UTC m=+1500.891672970" lastFinishedPulling="2026-03-18 15:59:28.596693159 +0000 UTC m=+1517.466022096" observedRunningTime="2026-03-18 15:59:29.848684701 +0000 UTC m=+1518.718013638" watchObservedRunningTime="2026-03-18 15:59:29.857997655 +0000 UTC m=+1518.727326592" Mar 18 15:59:29 crc kubenswrapper[4792]: E0318 15:59:29.892153 4792 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/cinder-api-0_openstack_cinder-api-9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e.log: no such file or directory" path="/var/log/containers/cinder-api-0_openstack_cinder-api-9da96e68f54a31c634cb57d0f931a08af976b5765b06f3bb35dabdd75922540e.log" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.902240 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-76vww" event={"ID":"971cd0c7-f215-44c4-bb0f-8930af5c49c5","Type":"ContainerStarted","Data":"055207d89d04d040cf5641167a0e4e5da47d11e070514494fc1b958ab62c4afe"} Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.915661 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56b66ddf48-4mkl2" Mar 18 15:59:29 crc kubenswrapper[4792]: I0318 15:59:29.915714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56b66ddf48-4mkl2" event={"ID":"49d8ca8f-4d78-40ba-b62c-7329c59cb7c5","Type":"ContainerDied","Data":"c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.022829 4792 scope.go:117] "RemoveContainer" containerID="e99c5656915b0bc8fad1f3cf92681aabbc27bb4893a1b5fb8674bd08fc4ca146" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.085221 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.104252 4792 scope.go:117] "RemoveContainer" containerID="e83dd235cb7655cc32dc1b455f73a87297d7a65837faf67dec7c6873f58184f1" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.113020 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.141198 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.153524 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56b66ddf48-4mkl2"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.165802 4792 scope.go:117] "RemoveContainer" containerID="a3fc31e388b6c2a17bc5a418f4aee15bc39270924b3f2af99fdb44f8479a617e" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.169176 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 
15:59:30 crc kubenswrapper[4792]: E0318 15:59:30.169734 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api-log" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.169750 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api-log" Mar 18 15:59:30 crc kubenswrapper[4792]: E0318 15:59:30.169775 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-httpd" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.169781 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-httpd" Mar 18 15:59:30 crc kubenswrapper[4792]: E0318 15:59:30.169802 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-api" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.169808 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-api" Mar 18 15:59:30 crc kubenswrapper[4792]: E0318 15:59:30.169824 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.169829 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.170095 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.170124 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" containerName="cinder-api-log" Mar 18 15:59:30 crc kubenswrapper[4792]: 
I0318 15:59:30.170147 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-httpd" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.170156 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" containerName="neutron-api" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.171806 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.174205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.176223 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.182797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.219407 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.253591 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3c34-account-create-update-r5zbr"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.269217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7llzh"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.277888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.279008 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d4c\" (UniqueName: \"kubernetes.io/projected/5f827383-b345-4dd5-958f-54a72cb634b7-kube-api-access-n6d4c\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.279215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.279400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.279823 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f827383-b345-4dd5-958f-54a72cb634b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.279922 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-scripts\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.280038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5f827383-b345-4dd5-958f-54a72cb634b7-logs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.280146 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.280311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.285281 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88f3-account-create-update-2s8l9"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.322926 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.323494 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.370075 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-db-create-mgrlp"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d4c\" (UniqueName: \"kubernetes.io/projected/5f827383-b345-4dd5-958f-54a72cb634b7-kube-api-access-n6d4c\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382797 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382914 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f827383-b345-4dd5-958f-54a72cb634b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-scripts\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.382999 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f827383-b345-4dd5-958f-54a72cb634b7-logs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.383025 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.385054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f827383-b345-4dd5-958f-54a72cb634b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.391128 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7984d74779-bxqk7"] Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.391431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f827383-b345-4dd5-958f-54a72cb634b7-logs\") pod 
\"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.396671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-scripts\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.397322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.400951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.410157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.415157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.429130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f827383-b345-4dd5-958f-54a72cb634b7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.439423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d4c\" (UniqueName: \"kubernetes.io/projected/5f827383-b345-4dd5-958f-54a72cb634b7-kube-api-access-n6d4c\") pod \"cinder-api-0\" (UID: \"5f827383-b345-4dd5-958f-54a72cb634b7\") " pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.512960 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 15:59:30 crc kubenswrapper[4792]: E0318 15:59:30.635020 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d8ca8f_4d78_40ba_b62c_7329c59cb7c5.slice/crio-c3e53a3caa1d5f681892bf432c47aba0ba48cddccef54c71bca02e9a811b1912\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4519ccde_5d1c_49b1_9f7b_4af0d2f2b557.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4519ccde_5d1c_49b1_9f7b_4af0d2f2b557.slice/crio-24c3e451888c9aae9218fe9a33b60d7d7854a843c7b629eeafa652da78a83921\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d8ca8f_4d78_40ba_b62c_7329c59cb7c5.slice\": RecentStats: unable to find data in memory cache]" Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.942929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" 
event={"ID":"1f56bb52-48fd-427b-9524-2074c22df4b0","Type":"ContainerStarted","Data":"69601541e8ce3e6c660d4cae9abdc94d2bade11628888ec0c8517b9f2d164486"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.946963 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerStarted","Data":"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.955469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3c34-account-create-update-r5zbr" event={"ID":"39938593-0553-4883-976f-d412c79c5357","Type":"ContainerStarted","Data":"9a83cc46bc49d91be598fe271e367a63bc4d141be71e8005821bcb29fa707f11"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.969164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7984d74779-bxqk7" event={"ID":"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f","Type":"ContainerStarted","Data":"a307ba8b1b79b67587768766b49b6519723170b3293b13c2f660f6b8f64812e5"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.982789 4792 generic.go:334] "Generic (PLEG): container finished" podID="cfb2ab4d-b4cd-4ada-9d14-70d845630eba" containerID="34adbb44391dcc67203b790c1e5370e397aebdbbeda2498d2118d425de02ed02" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.982997 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" event={"ID":"cfb2ab4d-b4cd-4ada-9d14-70d845630eba","Type":"ContainerDied","Data":"34adbb44391dcc67203b790c1e5370e397aebdbbeda2498d2118d425de02ed02"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.983048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" 
event={"ID":"cfb2ab4d-b4cd-4ada-9d14-70d845630eba","Type":"ContainerStarted","Data":"76c4ddfce9c1da69b5b7755470b32972f24468d4fbc1cbdb4860c5dbe3fed068"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.990228 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgrlp" event={"ID":"31f349af-14ee-488d-ab71-3be43d3950ce","Type":"ContainerStarted","Data":"4173d0a43c535c45374715b631ced19e44fb984d09a51c4c285951c83610533a"} Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.996279 4792 generic.go:334] "Generic (PLEG): container finished" podID="971cd0c7-f215-44c4-bb0f-8930af5c49c5" containerID="af40697eace1857ccb6cd8dbdb5ab5a314d44de4eab943998d393beda405dfd8" exitCode=0 Mar 18 15:59:30 crc kubenswrapper[4792]: I0318 15:59:30.996871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76vww" event={"ID":"971cd0c7-f215-44c4-bb0f-8930af5c49c5","Type":"ContainerDied","Data":"af40697eace1857ccb6cd8dbdb5ab5a314d44de4eab943998d393beda405dfd8"} Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.011819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7llzh" event={"ID":"292e5bc9-bca8-481a-99a8-512e17be912f","Type":"ContainerStarted","Data":"9dc8849535d51eacf39981efb4d3c4c8025868511b6d7820a2d63e2bb705b7d3"} Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.012465 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8hrjt" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="registry-server" containerID="cri-o://319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2" gracePeriod=2 Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.070429 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-7llzh" podStartSLOduration=9.070407976 podStartE2EDuration="9.070407976s" podCreationTimestamp="2026-03-18 15:59:22 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:31.053915004 +0000 UTC m=+1519.923243941" watchObservedRunningTime="2026-03-18 15:59:31.070407976 +0000 UTC m=+1519.939736913" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.159722 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 15:59:31 crc kubenswrapper[4792]: W0318 15:59:31.189234 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f827383_b345_4dd5_958f_54a72cb634b7.slice/crio-f272387030cf508f01b865612d72a70e20f26888979507fd2bd95ec213659537 WatchSource:0}: Error finding container f272387030cf508f01b865612d72a70e20f26888979507fd2bd95ec213659537: Status 404 returned error can't find the container with id f272387030cf508f01b865612d72a70e20f26888979507fd2bd95ec213659537 Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.810376 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.876525 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4519ccde-5d1c-49b1-9f7b-4af0d2f2b557" path="/var/lib/kubelet/pods/4519ccde-5d1c-49b1-9f7b-4af0d2f2b557/volumes" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.877270 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d8ca8f-4d78-40ba-b62c-7329c59cb7c5" path="/var/lib/kubelet/pods/49d8ca8f-4d78-40ba-b62c-7329c59cb7c5/volumes" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.919960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities" (OuterVolumeSpecName: "utilities") pod "82382aff-7beb-4e33-8a05-5f58f2d3b299" (UID: "82382aff-7beb-4e33-8a05-5f58f2d3b299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.923871 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities\") pod \"82382aff-7beb-4e33-8a05-5f58f2d3b299\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.924194 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content\") pod \"82382aff-7beb-4e33-8a05-5f58f2d3b299\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.924347 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwrp4\" (UniqueName: \"kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4\") pod 
\"82382aff-7beb-4e33-8a05-5f58f2d3b299\" (UID: \"82382aff-7beb-4e33-8a05-5f58f2d3b299\") " Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.925506 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:31 crc kubenswrapper[4792]: I0318 15:59:31.932561 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4" (OuterVolumeSpecName: "kube-api-access-vwrp4") pod "82382aff-7beb-4e33-8a05-5f58f2d3b299" (UID: "82382aff-7beb-4e33-8a05-5f58f2d3b299"). InnerVolumeSpecName "kube-api-access-vwrp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.026063 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f827383-b345-4dd5-958f-54a72cb634b7","Type":"ContainerStarted","Data":"f272387030cf508f01b865612d72a70e20f26888979507fd2bd95ec213659537"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.028526 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwrp4\" (UniqueName: \"kubernetes.io/projected/82382aff-7beb-4e33-8a05-5f58f2d3b299-kube-api-access-vwrp4\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.029020 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7984d74779-bxqk7" event={"ID":"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f","Type":"ContainerStarted","Data":"4d1fdefac64be55c571e8fb97cb4a6f2352a857f2057d6d840a419a6b56b63e3"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.029061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7984d74779-bxqk7" 
event={"ID":"e19c85aa-c5a1-4d0d-99ff-cc9283e5252f","Type":"ContainerStarted","Data":"77f77040c21ca6a43098e9cf11d30bbd95857a89a5f37855ad02f605e5238fca"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.029261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.029302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.037772 4792 generic.go:334] "Generic (PLEG): container finished" podID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerID="319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.037858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerDied","Data":"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.037894 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hrjt" event={"ID":"82382aff-7beb-4e33-8a05-5f58f2d3b299","Type":"ContainerDied","Data":"77d8afd4ba7670ab3c680cc27f3ba2449735ae906231a897f2ae7a9975c4534f"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.037915 4792 scope.go:117] "RemoveContainer" containerID="319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.038084 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hrjt" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.052366 4792 generic.go:334] "Generic (PLEG): container finished" podID="292e5bc9-bca8-481a-99a8-512e17be912f" containerID="c295e5f09de2193dca4d026669e6ca060797e588a59a67ab1332a95d0a7f8b47" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.052478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7llzh" event={"ID":"292e5bc9-bca8-481a-99a8-512e17be912f","Type":"ContainerDied","Data":"c295e5f09de2193dca4d026669e6ca060797e588a59a67ab1332a95d0a7f8b47"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.079861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerStarted","Data":"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.083565 4792 generic.go:334] "Generic (PLEG): container finished" podID="1f56bb52-48fd-427b-9524-2074c22df4b0" containerID="cbfd6fd049430a5385b4ef7fbdd7f9e73fe1a3da8ce0127d46b938f01729d8e8" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.083660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" event={"ID":"1f56bb52-48fd-427b-9524-2074c22df4b0","Type":"ContainerDied","Data":"cbfd6fd049430a5385b4ef7fbdd7f9e73fe1a3da8ce0127d46b938f01729d8e8"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.088891 4792 scope.go:117] "RemoveContainer" containerID="7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.089581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7984d74779-bxqk7" podStartSLOduration=13.089554391 podStartE2EDuration="13.089554391s" podCreationTimestamp="2026-03-18 15:59:19 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:32.067113231 +0000 UTC m=+1520.936442168" watchObservedRunningTime="2026-03-18 15:59:32.089554391 +0000 UTC m=+1520.958883328" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.092370 4792 generic.go:334] "Generic (PLEG): container finished" podID="31f349af-14ee-488d-ab71-3be43d3950ce" containerID="379da200892de03b2fa0a80240e5ab28e591cd78074fd5ef469c363ae8e0e4ce" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.092443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgrlp" event={"ID":"31f349af-14ee-488d-ab71-3be43d3950ce","Type":"ContainerDied","Data":"379da200892de03b2fa0a80240e5ab28e591cd78074fd5ef469c363ae8e0e4ce"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.096409 4792 generic.go:334] "Generic (PLEG): container finished" podID="39938593-0553-4883-976f-d412c79c5357" containerID="7cb3bd3b1a557c965a86d2d607d301f23df95f4ecc58fadafc0fe7bceacbf9e6" exitCode=0 Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.096601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3c34-account-create-update-r5zbr" event={"ID":"39938593-0553-4883-976f-d412c79c5357","Type":"ContainerDied","Data":"7cb3bd3b1a557c965a86d2d607d301f23df95f4ecc58fadafc0fe7bceacbf9e6"} Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.142120 4792 scope.go:117] "RemoveContainer" containerID="244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.194340 4792 scope.go:117] "RemoveContainer" containerID="319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2" Mar 18 15:59:32 crc kubenswrapper[4792]: E0318 15:59:32.198688 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2\": container with ID starting with 319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2 not found: ID does not exist" containerID="319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.198739 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2"} err="failed to get container status \"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2\": rpc error: code = NotFound desc = could not find container \"319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2\": container with ID starting with 319f9f919c38473760620b603c71a81918b677c333f2c72f6a16a3a96049c9e2 not found: ID does not exist" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.198768 4792 scope.go:117] "RemoveContainer" containerID="7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea" Mar 18 15:59:32 crc kubenswrapper[4792]: E0318 15:59:32.217896 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea\": container with ID starting with 7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea not found: ID does not exist" containerID="7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.217944 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea"} err="failed to get container status \"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea\": rpc error: code = NotFound desc = could not find container \"7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea\": container with ID 
starting with 7286f04c6da58c070120774e33aa4fbc62c3b110b8e1dc893f272c0a84959fea not found: ID does not exist" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.217996 4792 scope.go:117] "RemoveContainer" containerID="244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24" Mar 18 15:59:32 crc kubenswrapper[4792]: E0318 15:59:32.226163 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24\": container with ID starting with 244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24 not found: ID does not exist" containerID="244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.226210 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24"} err="failed to get container status \"244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24\": rpc error: code = NotFound desc = could not find container \"244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24\": container with ID starting with 244a6e3c97e875b33b82cdd43c92ac9516f32545388c667d76202da2a6001d24 not found: ID does not exist" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.292276 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82382aff-7beb-4e33-8a05-5f58f2d3b299" (UID: "82382aff-7beb-4e33-8a05-5f58f2d3b299"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.361952 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82382aff-7beb-4e33-8a05-5f58f2d3b299-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.539040 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.572558 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hrjt"] Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.869053 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.978158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg756\" (UniqueName: \"kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756\") pod \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.978315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts\") pod \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\" (UID: \"971cd0c7-f215-44c4-bb0f-8930af5c49c5\") " Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.988382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "971cd0c7-f215-44c4-bb0f-8930af5c49c5" (UID: "971cd0c7-f215-44c4-bb0f-8930af5c49c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:32 crc kubenswrapper[4792]: I0318 15:59:32.997828 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/971cd0c7-f215-44c4-bb0f-8930af5c49c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.004327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756" (OuterVolumeSpecName: "kube-api-access-mg756") pod "971cd0c7-f215-44c4-bb0f-8930af5c49c5" (UID: "971cd0c7-f215-44c4-bb0f-8930af5c49c5"). InnerVolumeSpecName "kube-api-access-mg756". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.015046 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.099424 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts\") pod \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.099491 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtl8\" (UniqueName: \"kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8\") pod \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\" (UID: \"cfb2ab4d-b4cd-4ada-9d14-70d845630eba\") " Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.100060 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg756\" (UniqueName: \"kubernetes.io/projected/971cd0c7-f215-44c4-bb0f-8930af5c49c5-kube-api-access-mg756\") on node \"crc\" DevicePath 
\"\"" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.102417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfb2ab4d-b4cd-4ada-9d14-70d845630eba" (UID: "cfb2ab4d-b4cd-4ada-9d14-70d845630eba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.108253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8" (OuterVolumeSpecName: "kube-api-access-nqtl8") pod "cfb2ab4d-b4cd-4ada-9d14-70d845630eba" (UID: "cfb2ab4d-b4cd-4ada-9d14-70d845630eba"). InnerVolumeSpecName "kube-api-access-nqtl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.129491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76vww" event={"ID":"971cd0c7-f215-44c4-bb0f-8930af5c49c5","Type":"ContainerDied","Data":"055207d89d04d040cf5641167a0e4e5da47d11e070514494fc1b958ab62c4afe"} Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.129537 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="055207d89d04d040cf5641167a0e4e5da47d11e070514494fc1b958ab62c4afe" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.129678 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76vww" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.146805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerStarted","Data":"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"} Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.151781 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f827383-b345-4dd5-958f-54a72cb634b7","Type":"ContainerStarted","Data":"124f9a33bd2ff1d76870c9f9996b2c64127b552e94b78f5a549a325358273674"} Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.160031 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.160123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e36b-account-create-update-mkc5t" event={"ID":"cfb2ab4d-b4cd-4ada-9d14-70d845630eba","Type":"ContainerDied","Data":"76c4ddfce9c1da69b5b7755470b32972f24468d4fbc1cbdb4860c5dbe3fed068"} Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.160211 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c4ddfce9c1da69b5b7755470b32972f24468d4fbc1cbdb4860c5dbe3fed068" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.203013 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqtl8\" (UniqueName: \"kubernetes.io/projected/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-kube-api-access-nqtl8\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.203047 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb2ab4d-b4cd-4ada-9d14-70d845630eba-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:33 crc 
kubenswrapper[4792]: I0318 15:59:33.837100 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.898283 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" path="/var/lib/kubelet/pods/82382aff-7beb-4e33-8a05-5f58f2d3b299/volumes" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.931167 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts\") pod \"39938593-0553-4883-976f-d412c79c5357\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.931480 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hq99\" (UniqueName: \"kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99\") pod \"39938593-0553-4883-976f-d412c79c5357\" (UID: \"39938593-0553-4883-976f-d412c79c5357\") " Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.935243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39938593-0553-4883-976f-d412c79c5357" (UID: "39938593-0553-4883-976f-d412c79c5357"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:33 crc kubenswrapper[4792]: I0318 15:59:33.945153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99" (OuterVolumeSpecName: "kube-api-access-8hq99") pod "39938593-0553-4883-976f-d412c79c5357" (UID: "39938593-0553-4883-976f-d412c79c5357"). 
InnerVolumeSpecName "kube-api-access-8hq99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:33.999535 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.018105 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.029142 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.034267 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39938593-0553-4883-976f-d412c79c5357-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.034301 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hq99\" (UniqueName: \"kubernetes.io/projected/39938593-0553-4883-976f-d412c79c5357-kube-api-access-8hq99\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.142671 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snf8t\" (UniqueName: \"kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t\") pod \"1f56bb52-48fd-427b-9524-2074c22df4b0\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.142743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpcbf\" (UniqueName: \"kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf\") pod \"31f349af-14ee-488d-ab71-3be43d3950ce\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " Mar 18 15:59:34 crc 
kubenswrapper[4792]: I0318 15:59:34.142797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts\") pod \"1f56bb52-48fd-427b-9524-2074c22df4b0\" (UID: \"1f56bb52-48fd-427b-9524-2074c22df4b0\") " Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.142834 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts\") pod \"31f349af-14ee-488d-ab71-3be43d3950ce\" (UID: \"31f349af-14ee-488d-ab71-3be43d3950ce\") " Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.142866 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4cj4\" (UniqueName: \"kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4\") pod \"292e5bc9-bca8-481a-99a8-512e17be912f\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.142937 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts\") pod \"292e5bc9-bca8-481a-99a8-512e17be912f\" (UID: \"292e5bc9-bca8-481a-99a8-512e17be912f\") " Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.144006 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "292e5bc9-bca8-481a-99a8-512e17be912f" (UID: "292e5bc9-bca8-481a-99a8-512e17be912f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.144779 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f56bb52-48fd-427b-9524-2074c22df4b0" (UID: "1f56bb52-48fd-427b-9524-2074c22df4b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.145436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31f349af-14ee-488d-ab71-3be43d3950ce" (UID: "31f349af-14ee-488d-ab71-3be43d3950ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.149143 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf" (OuterVolumeSpecName: "kube-api-access-dpcbf") pod "31f349af-14ee-488d-ab71-3be43d3950ce" (UID: "31f349af-14ee-488d-ab71-3be43d3950ce"). InnerVolumeSpecName "kube-api-access-dpcbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.150149 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4" (OuterVolumeSpecName: "kube-api-access-g4cj4") pod "292e5bc9-bca8-481a-99a8-512e17be912f" (UID: "292e5bc9-bca8-481a-99a8-512e17be912f"). InnerVolumeSpecName "kube-api-access-g4cj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.152114 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t" (OuterVolumeSpecName: "kube-api-access-snf8t") pod "1f56bb52-48fd-427b-9524-2074c22df4b0" (UID: "1f56bb52-48fd-427b-9524-2074c22df4b0"). InnerVolumeSpecName "kube-api-access-snf8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.194783 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7llzh" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.194889 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7llzh" event={"ID":"292e5bc9-bca8-481a-99a8-512e17be912f","Type":"ContainerDied","Data":"9dc8849535d51eacf39981efb4d3c4c8025868511b6d7820a2d63e2bb705b7d3"} Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.194928 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc8849535d51eacf39981efb4d3c4c8025868511b6d7820a2d63e2bb705b7d3" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.197126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" event={"ID":"1f56bb52-48fd-427b-9524-2074c22df4b0","Type":"ContainerDied","Data":"69601541e8ce3e6c660d4cae9abdc94d2bade11628888ec0c8517b9f2d164486"} Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.197213 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69601541e8ce3e6c660d4cae9abdc94d2bade11628888ec0c8517b9f2d164486" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.197173 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88f3-account-create-update-2s8l9" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.198930 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mgrlp" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.199130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mgrlp" event={"ID":"31f349af-14ee-488d-ab71-3be43d3950ce","Type":"ContainerDied","Data":"4173d0a43c535c45374715b631ced19e44fb984d09a51c4c285951c83610533a"} Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.199161 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4173d0a43c535c45374715b631ced19e44fb984d09a51c4c285951c83610533a" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.202703 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3c34-account-create-update-r5zbr" event={"ID":"39938593-0553-4883-976f-d412c79c5357","Type":"ContainerDied","Data":"9a83cc46bc49d91be598fe271e367a63bc4d141be71e8005821bcb29fa707f11"} Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.202735 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a83cc46bc49d91be598fe271e367a63bc4d141be71e8005821bcb29fa707f11" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.202779 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3c34-account-create-update-r5zbr" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.217696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5f827383-b345-4dd5-958f-54a72cb634b7","Type":"ContainerStarted","Data":"a26788661a0d8810e1bf50bdf1d923fa4881f8b8fc4e7815a8696fed67dec304"} Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.217982 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245640 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snf8t\" (UniqueName: \"kubernetes.io/projected/1f56bb52-48fd-427b-9524-2074c22df4b0-kube-api-access-snf8t\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245675 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpcbf\" (UniqueName: \"kubernetes.io/projected/31f349af-14ee-488d-ab71-3be43d3950ce-kube-api-access-dpcbf\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245685 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f56bb52-48fd-427b-9524-2074c22df4b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245694 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f349af-14ee-488d-ab71-3be43d3950ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245702 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4cj4\" (UniqueName: \"kubernetes.io/projected/292e5bc9-bca8-481a-99a8-512e17be912f-kube-api-access-g4cj4\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.245711 
4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292e5bc9-bca8-481a-99a8-512e17be912f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:34 crc kubenswrapper[4792]: I0318 15:59:34.247202 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.247179077 podStartE2EDuration="4.247179077s" podCreationTimestamp="2026-03-18 15:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:34.239214355 +0000 UTC m=+1523.108543292" watchObservedRunningTime="2026-03-18 15:59:34.247179077 +0000 UTC m=+1523.116508034" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.014224 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"] Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015128 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="registry-server" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015145 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="registry-server" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015179 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb2ab4d-b4cd-4ada-9d14-70d845630eba" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015187 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb2ab4d-b4cd-4ada-9d14-70d845630eba" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015216 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="extract-content" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 
15:59:35.015224 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="extract-content" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015240 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f349af-14ee-488d-ab71-3be43d3950ce" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015247 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f349af-14ee-488d-ab71-3be43d3950ce" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015260 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292e5bc9-bca8-481a-99a8-512e17be912f" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015266 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="292e5bc9-bca8-481a-99a8-512e17be912f" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015281 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39938593-0553-4883-976f-d412c79c5357" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015287 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="39938593-0553-4883-976f-d412c79c5357" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015297 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="extract-utilities" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015303 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="extract-utilities" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015313 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f56bb52-48fd-427b-9524-2074c22df4b0" 
containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015320 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f56bb52-48fd-427b-9524-2074c22df4b0" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: E0318 15:59:35.015330 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971cd0c7-f215-44c4-bb0f-8930af5c49c5" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015336 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="971cd0c7-f215-44c4-bb0f-8930af5c49c5" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015569 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="971cd0c7-f215-44c4-bb0f-8930af5c49c5" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015583 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="292e5bc9-bca8-481a-99a8-512e17be912f" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015594 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f349af-14ee-488d-ab71-3be43d3950ce" containerName="mariadb-database-create" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015608 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="82382aff-7beb-4e33-8a05-5f58f2d3b299" containerName="registry-server" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015623 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb2ab4d-b4cd-4ada-9d14-70d845630eba" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015636 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f56bb52-48fd-427b-9524-2074c22df4b0" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.015651 
4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="39938593-0553-4883-976f-d412c79c5357" containerName="mariadb-account-create-update" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.016589 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.018680 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mncxd" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.019695 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.020358 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.034274 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.202368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.202445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.202471 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.202597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfpb\" (UniqueName: \"kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.280061 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.282256 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.300522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.301362 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.306629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerStarted","Data":"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"} Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.308696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvfpb\" (UniqueName: \"kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " 
pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.308902 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.308945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.308984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.309517 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-central-agent" containerID="cri-o://96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82" gracePeriod=30 Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.309686 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.309747 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="proxy-httpd" 
containerID="cri-o://f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4" gracePeriod=30 Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.309804 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="sg-core" containerID="cri-o://b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5" gracePeriod=30 Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.309854 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-notification-agent" containerID="cri-o://74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f" gracePeriod=30 Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.338039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.357782 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.379384 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.382374 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.382584 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfpb\" (UniqueName: \"kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.387154 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom\") pod \"heat-engine-55c78ffbd6-zkvtj\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.411458 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn4kx\" (UniqueName: \"kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.411801 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.411990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: 
\"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.412273 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.459041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.459641 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.517438 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflkg\" (UniqueName: \"kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.517680 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.517768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn4kx\" (UniqueName: \"kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: 
\"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.517843 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.517992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.518087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.518176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.518273 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: 
\"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.518361 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.518443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.530790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.539477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.539559 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.551675 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.558136 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.564420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.594241 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn4kx\" (UniqueName: \"kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx\") pod \"heat-cfnapi-7f48cb748-m6mrx\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.614670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=12.834025467 podStartE2EDuration="18.614646424s" podCreationTimestamp="2026-03-18 15:59:17 +0000 UTC" firstStartedPulling="2026-03-18 15:59:28.276124895 +0000 UTC m=+1517.145453832" lastFinishedPulling="2026-03-18 15:59:34.056745852 +0000 UTC m=+1522.926074789" observedRunningTime="2026-03-18 15:59:35.440557155 +0000 UTC m=+1524.309886102" watchObservedRunningTime="2026-03-18 15:59:35.614646424 +0000 UTC m=+1524.483975391" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.619114 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.620674 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.620749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.620788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.628336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflkg\" (UniqueName: \"kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.628446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.628683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.623761 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.623137 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.631375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.624811 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.636095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0\") pod 
\"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.655940 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflkg\" (UniqueName: \"kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg\") pod \"dnsmasq-dns-688b9f5b49-bdzhh\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.656617 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.732138 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.732243 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djq6v\" (UniqueName: \"kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.732330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.732459 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.803682 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.837448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.837634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djq6v\" (UniqueName: \"kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.837775 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.838023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: 
\"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.849117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.850041 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.850173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:35 crc kubenswrapper[4792]: I0318 15:59:35.874660 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djq6v\" (UniqueName: \"kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v\") pod \"heat-api-87c7b6cff-mr5gh\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.054946 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.361930 4792 generic.go:334] "Generic (PLEG): container finished" podID="a600353b-7ea9-490d-965b-ff93147642c4" containerID="b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5" exitCode=2 Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.362460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerDied","Data":"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"} Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.412383 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"] Mar 18 15:59:36 crc kubenswrapper[4792]: W0318 15:59:36.449065 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f058b5_e315_4e43_a124_9bb0be63d5fc.slice/crio-10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958 WatchSource:0}: Error finding container 10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958: Status 404 returned error can't find the container with id 10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958 Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.465640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.645660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 15:59:36 crc kubenswrapper[4792]: W0318 15:59:36.659305 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d39a94_cc37_4e16_91ad_a3a2a3fdda9f.slice/crio-7e5ac2061537a60c39384eb15e494d79c8b26def08deb4adf3162dd0d4283474 WatchSource:0}: Error finding 
container 7e5ac2061537a60c39384eb15e494d79c8b26def08deb4adf3162dd0d4283474: Status 404 returned error can't find the container with id 7e5ac2061537a60c39384eb15e494d79c8b26def08deb4adf3162dd0d4283474 Mar 18 15:59:36 crc kubenswrapper[4792]: I0318 15:59:36.889908 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:36 crc kubenswrapper[4792]: W0318 15:59:36.900664 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fd853fa_1d25_4065_ae01_709ad8473497.slice/crio-429b3d0789029e0b5a534b50c1eba51ae0cf527956e68128cd66ddc76cd9ffb1 WatchSource:0}: Error finding container 429b3d0789029e0b5a534b50c1eba51ae0cf527956e68128cd66ddc76cd9ffb1: Status 404 returned error can't find the container with id 429b3d0789029e0b5a534b50c1eba51ae0cf527956e68128cd66ddc76cd9ffb1 Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.390709 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" event={"ID":"45f058b5-e315-4e43-a124-9bb0be63d5fc","Type":"ContainerStarted","Data":"10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.398951 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-87c7b6cff-mr5gh" event={"ID":"3fd853fa-1d25-4065-ae01-709ad8473497","Type":"ContainerStarted","Data":"429b3d0789029e0b5a534b50c1eba51ae0cf527956e68128cd66ddc76cd9ffb1"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.405333 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55c78ffbd6-zkvtj" event={"ID":"3019eb4a-4185-4235-97dd-4f3accae8352","Type":"ContainerStarted","Data":"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.405391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-55c78ffbd6-zkvtj" event={"ID":"3019eb4a-4185-4235-97dd-4f3accae8352","Type":"ContainerStarted","Data":"76b0c8e62f06cc0b4012175ff2bd5e2ade62041febacebecdadf5f4f3cc404f3"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.405530 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.413516 4792 generic.go:334] "Generic (PLEG): container finished" podID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerID="2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1" exitCode=0 Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.413758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" event={"ID":"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f","Type":"ContainerDied","Data":"2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.414145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" event={"ID":"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f","Type":"ContainerStarted","Data":"7e5ac2061537a60c39384eb15e494d79c8b26def08deb4adf3162dd0d4283474"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.430732 4792 generic.go:334] "Generic (PLEG): container finished" podID="a600353b-7ea9-490d-965b-ff93147642c4" containerID="74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f" exitCode=0 Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.430770 4792 generic.go:334] "Generic (PLEG): container finished" podID="a600353b-7ea9-490d-965b-ff93147642c4" containerID="96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82" exitCode=0 Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.430797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerDied","Data":"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.430831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerDied","Data":"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"} Mar 18 15:59:37 crc kubenswrapper[4792]: I0318 15:59:37.440407 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-55c78ffbd6-zkvtj" podStartSLOduration=3.440381179 podStartE2EDuration="3.440381179s" podCreationTimestamp="2026-03-18 15:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:37.42240719 +0000 UTC m=+1526.291736137" watchObservedRunningTime="2026-03-18 15:59:37.440381179 +0000 UTC m=+1526.309710116" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.453722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" event={"ID":"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f","Type":"ContainerStarted","Data":"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b"} Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.500726 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" podStartSLOduration=3.500701897 podStartE2EDuration="3.500701897s" podCreationTimestamp="2026-03-18 15:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:38.482368908 +0000 UTC m=+1527.351697845" watchObservedRunningTime="2026-03-18 15:59:38.500701897 +0000 UTC m=+1527.370030834" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.753439 4792 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-conductor-db-sync-jjzdm"] Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.755565 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.764198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d6kvz" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.764422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.765486 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.770306 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jjzdm"] Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.847607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.847727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gffn6\" (UniqueName: \"kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.847780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.847837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.949782 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.949948 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gffn6\" (UniqueName: \"kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.950023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.950096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.966317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.969613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.977049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:38 crc kubenswrapper[4792]: I0318 15:59:38.986452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gffn6\" (UniqueName: \"kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6\") pod \"nova-cell0-conductor-db-sync-jjzdm\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:39 crc kubenswrapper[4792]: I0318 15:59:39.084631 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 15:59:39 crc kubenswrapper[4792]: I0318 15:59:39.463840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.020500 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.020981 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7984d74779-bxqk7" Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.203724 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jjzdm"] Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.477862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" event={"ID":"45f058b5-e315-4e43-a124-9bb0be63d5fc","Type":"ContainerStarted","Data":"9119ca9c98137af4d67b7b0e56211f071e93407c34b0d86e6d14b21e1aaca9b8"} Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.478883 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.487731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-87c7b6cff-mr5gh" event={"ID":"3fd853fa-1d25-4065-ae01-709ad8473497","Type":"ContainerStarted","Data":"26573bed61b358994e088ef6e2ebef9b522c88108c71885103c961d5cd280654"} Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.491240 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" event={"ID":"29554117-39ac-4a2b-bd31-4d6858fb7931","Type":"ContainerStarted","Data":"ebfce030f3fe73c3648168536a986fbaa7f8149f8b8bf9734b6cc0b145786817"} Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.514026 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podStartSLOduration=2.435946149 podStartE2EDuration="5.514007357s" podCreationTimestamp="2026-03-18 15:59:35 +0000 UTC" firstStartedPulling="2026-03-18 15:59:36.458168303 +0000 UTC m=+1525.327497240" lastFinishedPulling="2026-03-18 15:59:39.536229511 +0000 UTC m=+1528.405558448" observedRunningTime="2026-03-18 15:59:40.504769095 +0000 UTC m=+1529.374098042" watchObservedRunningTime="2026-03-18 15:59:40.514007357 +0000 UTC m=+1529.383336294" Mar 18 15:59:40 crc kubenswrapper[4792]: I0318 15:59:40.551762 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-87c7b6cff-mr5gh" podStartSLOduration=2.916796792 podStartE2EDuration="5.551743041s" podCreationTimestamp="2026-03-18 15:59:35 +0000 UTC" firstStartedPulling="2026-03-18 15:59:36.904309298 +0000 UTC m=+1525.773638235" lastFinishedPulling="2026-03-18 15:59:39.539255537 +0000 UTC m=+1528.408584484" observedRunningTime="2026-03-18 15:59:40.542503999 +0000 UTC m=+1529.411832936" watchObservedRunningTime="2026-03-18 15:59:40.551743041 +0000 UTC m=+1529.421071968" Mar 18 15:59:41 crc kubenswrapper[4792]: I0318 15:59:41.056515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:43 crc kubenswrapper[4792]: I0318 15:59:43.344687 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.601146 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.603618 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.617049 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.619204 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.637419 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.673021 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.709030 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.710907 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pq5\" (UniqueName: \"kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738656 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738757 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom\") pod 
\"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738790 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738846 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.738884 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkzwx\" (UniqueName: \"kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.769163 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"] Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841611 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wl9\" (UniqueName: \"kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pq5\" (UniqueName: \"kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841834 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.841964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.842042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.842075 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gkzwx\" (UniqueName: \"kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.858181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.860755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.860806 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.861429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.864804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.867635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkzwx\" (UniqueName: \"kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx\") pod \"heat-engine-6c99df75d9-tzn29\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.876779 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.877777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pq5\" (UniqueName: \"kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5\") pod \"heat-cfnapi-845984c8c-xtlbs\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.945133 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.945531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.945811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.945939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wl9\" (UniqueName: \"kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.950673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.951065 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.952012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data\") pod 
\"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.953081 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.979009 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wl9\" (UniqueName: \"kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9\") pod \"heat-api-7846658d4b-mc89v\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:44 crc kubenswrapper[4792]: I0318 15:59:44.985734 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:45 crc kubenswrapper[4792]: I0318 15:59:45.038751 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:45 crc kubenswrapper[4792]: I0318 15:59:45.803817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 15:59:45 crc kubenswrapper[4792]: I0318 15:59:45.808926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.003611 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.003908 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="dnsmasq-dns" containerID="cri-o://a795f848d128a13d931b826d0d66939ba57886ba1e016d95571a680f31e40671" gracePeriod=10 Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.069486 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.102725 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"] Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.619356 4792 generic.go:334] "Generic (PLEG): container finished" podID="5378b261-2167-4eae-a672-0dd816a99a18" containerID="a795f848d128a13d931b826d0d66939ba57886ba1e016d95571a680f31e40671" exitCode=0 Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.619816 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" event={"ID":"5378b261-2167-4eae-a672-0dd816a99a18","Type":"ContainerDied","Data":"a795f848d128a13d931b826d0d66939ba57886ba1e016d95571a680f31e40671"} Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.628104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-845984c8c-xtlbs" event={"ID":"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c","Type":"ContainerStarted","Data":"1537a60b37149d2a0bee3bface44cbc26ed5f0d1215b76c0de7c34b36df22e8e"} Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.635401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7846658d4b-mc89v" event={"ID":"3a42eef3-14d6-4b01-b5d0-a0d74399d586","Type":"ContainerStarted","Data":"4585100fe59d009bd6bc88503c28d8d971e676a9085c268256f6ab5fbd446d74"} Mar 18 15:59:46 crc kubenswrapper[4792]: I0318 15:59:46.641314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c99df75d9-tzn29" event={"ID":"db801d44-72e6-44db-a478-e745ecf3d278","Type":"ContainerStarted","Data":"f9ab5c1eacc3fe53f1d604307fd9fe46328369e2896930b8e9427f269db48a62"} Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.091795 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135157 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv2sd\" (UniqueName: \"kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135573 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135690 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.135810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.147608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd" (OuterVolumeSpecName: "kube-api-access-lv2sd") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "kube-api-access-lv2sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.238802 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv2sd\" (UniqueName: \"kubernetes.io/projected/5378b261-2167-4eae-a672-0dd816a99a18-kube-api-access-lv2sd\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.344818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.345286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: W0318 15:59:47.346078 4792 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5378b261-2167-4eae-a672-0dd816a99a18/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.346105 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.346452 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.360447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config" (OuterVolumeSpecName: "config") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: E0318 15:59:47.368401 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc podName:5378b261-2167-4eae-a672-0dd816a99a18 nodeName:}" failed. No retries permitted until 2026-03-18 15:59:47.868370418 +0000 UTC m=+1536.737699355 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18") : error deleting /var/lib/kubelet/pods/5378b261-2167-4eae-a672-0dd816a99a18/volume-subpaths: remove /var/lib/kubelet/pods/5378b261-2167-4eae-a672-0dd816a99a18/volume-subpaths: no such file or directory Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.368590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.448126 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.448171 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-config\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.448180 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.448190 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.655820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" event={"ID":"5378b261-2167-4eae-a672-0dd816a99a18","Type":"ContainerDied","Data":"525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac"} Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.655881 4792 scope.go:117] "RemoveContainer" containerID="a795f848d128a13d931b826d0d66939ba57886ba1e016d95571a680f31e40671" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.655930 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8khs2" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.723413 4792 scope.go:117] "RemoveContainer" containerID="613e7bdff76f4f99cbb1c0643480e2a79b2d05f5cca546408dc3b877c1eda530" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.967392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") pod \"5378b261-2167-4eae-a672-0dd816a99a18\" (UID: \"5378b261-2167-4eae-a672-0dd816a99a18\") " Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.967912 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5378b261-2167-4eae-a672-0dd816a99a18" (UID: "5378b261-2167-4eae-a672-0dd816a99a18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:59:47 crc kubenswrapper[4792]: I0318 15:59:47.968913 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5378b261-2167-4eae-a672-0dd816a99a18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.030226 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.206709 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.206999 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" 
containerID="cri-o://26573bed61b358994e088ef6e2ebef9b522c88108c71885103c961d5cd280654" gracePeriod=60 Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.229263 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.232:8004/healthcheck\": EOF" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.229920 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.232:8004/healthcheck\": EOF" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.261196 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 15:59:48 crc kubenswrapper[4792]: E0318 15:59:48.266946 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="init" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.267013 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="init" Mar 18 15:59:48 crc kubenswrapper[4792]: E0318 15:59:48.267071 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="dnsmasq-dns" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.267081 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="dnsmasq-dns" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.267339 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5378b261-2167-4eae-a672-0dd816a99a18" containerName="dnsmasq-dns" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.268302 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.282749 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.292118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.292186 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.296481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.282960 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" containerID="cri-o://9119ca9c98137af4d67b7b0e56211f071e93407c34b0d86e6d14b21e1aaca9b8" gracePeriod=60 Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.309041 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.310808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.312783 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.230:8000/healthcheck\": EOF" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.312825 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.313160 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.230:8000/healthcheck\": EOF" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.313658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.366579 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384345 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " 
pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384528 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96z9f\" (UniqueName: \"kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: 
\"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktbx\" (UniqueName: \"kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.384748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.471013 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktbx\" (UniqueName: \"kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487730 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487839 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487912 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.487999 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.488026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.488044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96z9f\" (UniqueName: \"kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.491408 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8khs2"] Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.500702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.501266 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc 
kubenswrapper[4792]: I0318 15:59:48.501593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.502262 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.502440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.504013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.504209 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.506216 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.515884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.515987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96z9f\" (UniqueName: \"kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f\") pod \"heat-api-7d6bbd8cf5-xbjzv\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.520210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.526060 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktbx\" (UniqueName: \"kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx\") pod \"heat-cfnapi-6fbd9cff4c-khq7b\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: E0318 15:59:48.567665 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5378b261_2167_4eae_a672_0dd816a99a18.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5378b261_2167_4eae_a672_0dd816a99a18.slice/crio-525030698e46ee18d89cda86a3cf3c253dd290a2f6aa53292c7225dd4795c5ac\": RecentStats: unable to find data in memory cache]" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.706306 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7846658d4b-mc89v" event={"ID":"3a42eef3-14d6-4b01-b5d0-a0d74399d586","Type":"ContainerStarted","Data":"c1f213d8c2436b441f75a3287428c96920cce5c88de784a9945fe8740ab3f7c3"} Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.707077 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.718661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c99df75d9-tzn29" event={"ID":"db801d44-72e6-44db-a478-e745ecf3d278","Type":"ContainerStarted","Data":"daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453"} Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.719868 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.722856 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.740051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-845984c8c-xtlbs" event={"ID":"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c","Type":"ContainerStarted","Data":"460f7efca4c984cd64dc6086c6d7a36f29c71e001bf6901f6494499f8814bec3"} Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.740391 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.764925 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7846658d4b-mc89v" podStartSLOduration=4.764908053 podStartE2EDuration="4.764908053s" podCreationTimestamp="2026-03-18 15:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:48.733563081 +0000 UTC m=+1537.602892028" watchObservedRunningTime="2026-03-18 15:59:48.764908053 +0000 UTC m=+1537.634236990" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.772623 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.787332 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6c99df75d9-tzn29" podStartSLOduration=4.787307552 podStartE2EDuration="4.787307552s" podCreationTimestamp="2026-03-18 15:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:48.758829711 +0000 UTC m=+1537.628158648" watchObservedRunningTime="2026-03-18 15:59:48.787307552 +0000 UTC m=+1537.656636489" Mar 18 15:59:48 crc kubenswrapper[4792]: I0318 15:59:48.805705 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-845984c8c-xtlbs" podStartSLOduration=4.805681453 podStartE2EDuration="4.805681453s" podCreationTimestamp="2026-03-18 15:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:48.777417119 +0000 UTC m=+1537.646746066" watchObservedRunningTime="2026-03-18 15:59:48.805681453 +0000 UTC m=+1537.675010390" Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.779327 4792 generic.go:334] "Generic (PLEG): container finished" podID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerID="460f7efca4c984cd64dc6086c6d7a36f29c71e001bf6901f6494499f8814bec3" exitCode=1 Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.779431 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-845984c8c-xtlbs" event={"ID":"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c","Type":"ContainerDied","Data":"460f7efca4c984cd64dc6086c6d7a36f29c71e001bf6901f6494499f8814bec3"} Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.780169 4792 scope.go:117] "RemoveContainer" containerID="460f7efca4c984cd64dc6086c6d7a36f29c71e001bf6901f6494499f8814bec3" Mar 18 15:59:49 crc kubenswrapper[4792]: 
I0318 15:59:49.782435 4792 generic.go:334] "Generic (PLEG): container finished" podID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerID="c1f213d8c2436b441f75a3287428c96920cce5c88de784a9945fe8740ab3f7c3" exitCode=1 Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.783957 4792 scope.go:117] "RemoveContainer" containerID="c1f213d8c2436b441f75a3287428c96920cce5c88de784a9945fe8740ab3f7c3" Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.784263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7846658d4b-mc89v" event={"ID":"3a42eef3-14d6-4b01-b5d0-a0d74399d586","Type":"ContainerDied","Data":"c1f213d8c2436b441f75a3287428c96920cce5c88de784a9945fe8740ab3f7c3"} Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.886461 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5378b261-2167-4eae-a672-0dd816a99a18" path="/var/lib/kubelet/pods/5378b261-2167-4eae-a672-0dd816a99a18/volumes" Mar 18 15:59:49 crc kubenswrapper[4792]: I0318 15:59:49.988183 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:50 crc kubenswrapper[4792]: I0318 15:59:50.040526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.618694 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.232:8004/healthcheck\": read tcp 10.217.0.2:51572->10.217.0.232:8004: read: connection reset by peer" Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.619704 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" probeResult="failure" output="Get 
\"http://10.217.0.232:8004/healthcheck\": dial tcp 10.217.0.232:8004: connect: connection refused" Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.715951 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.230:8000/healthcheck\": read tcp 10.217.0.2:56404->10.217.0.230:8000: read: connection reset by peer" Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.716840 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.230:8000/healthcheck\": dial tcp 10.217.0.230:8000: connect: connection refused" Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.879791 4792 generic.go:334] "Generic (PLEG): container finished" podID="3fd853fa-1d25-4065-ae01-709ad8473497" containerID="26573bed61b358994e088ef6e2ebef9b522c88108c71885103c961d5cd280654" exitCode=0 Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.879850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-87c7b6cff-mr5gh" event={"ID":"3fd853fa-1d25-4065-ae01-709ad8473497","Type":"ContainerDied","Data":"26573bed61b358994e088ef6e2ebef9b522c88108c71885103c961d5cd280654"} Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.882061 4792 generic.go:334] "Generic (PLEG): container finished" podID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerID="9119ca9c98137af4d67b7b0e56211f071e93407c34b0d86e6d14b21e1aaca9b8" exitCode=0 Mar 18 15:59:54 crc kubenswrapper[4792]: I0318 15:59:54.882104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" event={"ID":"45f058b5-e315-4e43-a124-9bb0be63d5fc","Type":"ContainerDied","Data":"9119ca9c98137af4d67b7b0e56211f071e93407c34b0d86e6d14b21e1aaca9b8"} Mar 18 
15:59:55 crc kubenswrapper[4792]: I0318 15:59:55.507461 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 15:59:55 crc kubenswrapper[4792]: I0318 15:59:55.670153 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.230:8000/healthcheck\": dial tcp 10.217.0.230:8000: connect: connection refused" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.316990 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.566474 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle\") pod \"3fd853fa-1d25-4065-ae01-709ad8473497\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.566892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djq6v\" (UniqueName: \"kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v\") pod \"3fd853fa-1d25-4065-ae01-709ad8473497\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.567021 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data\") pod \"3fd853fa-1d25-4065-ae01-709ad8473497\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.567050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom\") pod \"3fd853fa-1d25-4065-ae01-709ad8473497\" (UID: \"3fd853fa-1d25-4065-ae01-709ad8473497\") " Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.594123 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fd853fa-1d25-4065-ae01-709ad8473497" (UID: "3fd853fa-1d25-4065-ae01-709ad8473497"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.618164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v" (OuterVolumeSpecName: "kube-api-access-djq6v") pod "3fd853fa-1d25-4065-ae01-709ad8473497" (UID: "3fd853fa-1d25-4065-ae01-709ad8473497"). InnerVolumeSpecName "kube-api-access-djq6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.678914 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djq6v\" (UniqueName: \"kubernetes.io/projected/3fd853fa-1d25-4065-ae01-709ad8473497-kube-api-access-djq6v\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.678958 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.775272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fd853fa-1d25-4065-ae01-709ad8473497" (UID: "3fd853fa-1d25-4065-ae01-709ad8473497"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.781945 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.874194 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data" (OuterVolumeSpecName: "config-data") pod "3fd853fa-1d25-4065-ae01-709ad8473497" (UID: "3fd853fa-1d25-4065-ae01-709ad8473497"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.886703 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd853fa-1d25-4065-ae01-709ad8473497-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.930393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" event={"ID":"45f058b5-e315-4e43-a124-9bb0be63d5fc","Type":"ContainerDied","Data":"10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958"} Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.930440 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e672eeafc4ce0c5c9b7750ba74ef0c8565810914c1e95e5d4ec34591d66958" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.933617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-87c7b6cff-mr5gh" event={"ID":"3fd853fa-1d25-4065-ae01-709ad8473497","Type":"ContainerDied","Data":"429b3d0789029e0b5a534b50c1eba51ae0cf527956e68128cd66ddc76cd9ffb1"} Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.933660 4792 scope.go:117] "RemoveContainer" 
containerID="26573bed61b358994e088ef6e2ebef9b522c88108c71885103c961d5cd280654" Mar 18 15:59:56 crc kubenswrapper[4792]: I0318 15:59:56.933791 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-87c7b6cff-mr5gh" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.023795 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.067017 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.091118 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom\") pod \"45f058b5-e315-4e43-a124-9bb0be63d5fc\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.091166 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn4kx\" (UniqueName: \"kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx\") pod \"45f058b5-e315-4e43-a124-9bb0be63d5fc\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.091366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle\") pod \"45f058b5-e315-4e43-a124-9bb0be63d5fc\" (UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.091411 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data\") pod \"45f058b5-e315-4e43-a124-9bb0be63d5fc\" 
(UID: \"45f058b5-e315-4e43-a124-9bb0be63d5fc\") " Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.097907 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx" (OuterVolumeSpecName: "kube-api-access-jn4kx") pod "45f058b5-e315-4e43-a124-9bb0be63d5fc" (UID: "45f058b5-e315-4e43-a124-9bb0be63d5fc"). InnerVolumeSpecName "kube-api-access-jn4kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.104585 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.112308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45f058b5-e315-4e43-a124-9bb0be63d5fc" (UID: "45f058b5-e315-4e43-a124-9bb0be63d5fc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.116913 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-87c7b6cff-mr5gh"] Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.202704 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.202744 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn4kx\" (UniqueName: \"kubernetes.io/projected/45f058b5-e315-4e43-a124-9bb0be63d5fc-kube-api-access-jn4kx\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.275863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.300209 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45f058b5-e315-4e43-a124-9bb0be63d5fc" (UID: "45f058b5-e315-4e43-a124-9bb0be63d5fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.321902 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.391133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data" (OuterVolumeSpecName: "config-data") pod "45f058b5-e315-4e43-a124-9bb0be63d5fc" (UID: "45f058b5-e315-4e43-a124-9bb0be63d5fc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.428932 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f058b5-e315-4e43-a124-9bb0be63d5fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.907081 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" path="/var/lib/kubelet/pods/3fd853fa-1d25-4065-ae01-709ad8473497/volumes" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.977622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" event={"ID":"29554117-39ac-4a2b-bd31-4d6858fb7931","Type":"ContainerStarted","Data":"9110fc60e1686b32186ba347c797f65e8a8ec417eeb79ae7dabc6926895c80bb"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.980398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" event={"ID":"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731","Type":"ContainerStarted","Data":"7d8cc42d45a072a54555d56f0c1d99a4dd15f37a57417d38c220aac34a56b077"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.980430 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" event={"ID":"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731","Type":"ContainerStarted","Data":"2e3e3c58b6078cc0280f9bda4afe2961948f26c9bcbe64c3935e0daefc384c4c"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.981414 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.986066 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" 
event={"ID":"7cbfe80e-2708-4672-aa17-bb5679fdc195","Type":"ContainerStarted","Data":"59c4dbb186863be5449bf6e56fcd3c162543a849829d5aa01e0ce0c735299742"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.986126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" event={"ID":"7cbfe80e-2708-4672-aa17-bb5679fdc195","Type":"ContainerStarted","Data":"0f1f734df109dff5e2b5c0331a5bcd52e49d5dc4e56a397a68aaceb26db3a1f1"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.986321 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.992169 4792 generic.go:334] "Generic (PLEG): container finished" podID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerID="ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961" exitCode=1 Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.992466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-845984c8c-xtlbs" event={"ID":"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c","Type":"ContainerDied","Data":"ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961"} Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.992497 4792 scope.go:117] "RemoveContainer" containerID="460f7efca4c984cd64dc6086c6d7a36f29c71e001bf6901f6494499f8814bec3" Mar 18 15:59:57 crc kubenswrapper[4792]: I0318 15:59:57.993484 4792 scope.go:117] "RemoveContainer" containerID="ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961" Mar 18 15:59:57 crc kubenswrapper[4792]: E0318 15:59:57.993774 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-845984c8c-xtlbs_openstack(1fb13e2f-b3b8-4485-8d1f-6627ac01b27c)\"" pod="openstack/heat-cfnapi-845984c8c-xtlbs" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" Mar 18 
15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.007224 4792 generic.go:334] "Generic (PLEG): container finished" podID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerID="26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2" exitCode=1 Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.007668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7846658d4b-mc89v" event={"ID":"3a42eef3-14d6-4b01-b5d0-a0d74399d586","Type":"ContainerDied","Data":"26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2"} Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.008666 4792 scope.go:117] "RemoveContainer" containerID="26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2" Mar 18 15:59:58 crc kubenswrapper[4792]: E0318 15:59:58.009180 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7846658d4b-mc89v_openstack(3a42eef3-14d6-4b01-b5d0-a0d74399d586)\"" pod="openstack/heat-api-7846658d4b-mc89v" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.017157 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7f48cb748-m6mrx" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.049777 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" podStartSLOduration=3.9806331520000002 podStartE2EDuration="20.049755592s" podCreationTimestamp="2026-03-18 15:59:38 +0000 UTC" firstStartedPulling="2026-03-18 15:59:40.210762003 +0000 UTC m=+1529.080090940" lastFinishedPulling="2026-03-18 15:59:56.279884443 +0000 UTC m=+1545.149213380" observedRunningTime="2026-03-18 15:59:58.014169166 +0000 UTC m=+1546.883498113" watchObservedRunningTime="2026-03-18 15:59:58.049755592 +0000 UTC m=+1546.919084529" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.109719 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" podStartSLOduration=10.109694348 podStartE2EDuration="10.109694348s" podCreationTimestamp="2026-03-18 15:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:58.034598023 +0000 UTC m=+1546.903926980" watchObservedRunningTime="2026-03-18 15:59:58.109694348 +0000 UTC m=+1546.979023285" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.110369 4792 scope.go:117] "RemoveContainer" containerID="c1f213d8c2436b441f75a3287428c96920cce5c88de784a9945fe8740ab3f7c3" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.194378 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" podStartSLOduration=10.194356357 podStartE2EDuration="10.194356357s" podCreationTimestamp="2026-03-18 15:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:59:58.053511091 +0000 UTC m=+1546.922840028" watchObservedRunningTime="2026-03-18 15:59:58.194356357 
+0000 UTC m=+1547.063685294" Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.250048 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:58 crc kubenswrapper[4792]: I0318 15:59:58.263535 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7f48cb748-m6mrx"] Mar 18 15:59:59 crc kubenswrapper[4792]: I0318 15:59:59.031890 4792 scope.go:117] "RemoveContainer" containerID="26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2" Mar 18 15:59:59 crc kubenswrapper[4792]: E0318 15:59:59.032170 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7846658d4b-mc89v_openstack(3a42eef3-14d6-4b01-b5d0-a0d74399d586)\"" pod="openstack/heat-api-7846658d4b-mc89v" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" Mar 18 15:59:59 crc kubenswrapper[4792]: I0318 15:59:59.035002 4792 scope.go:117] "RemoveContainer" containerID="ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961" Mar 18 15:59:59 crc kubenswrapper[4792]: E0318 15:59:59.035282 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-845984c8c-xtlbs_openstack(1fb13e2f-b3b8-4485-8d1f-6627ac01b27c)\"" pod="openstack/heat-cfnapi-845984c8c-xtlbs" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" Mar 18 15:59:59 crc kubenswrapper[4792]: I0318 15:59:59.881281 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" path="/var/lib/kubelet/pods/45f058b5-e315-4e43-a124-9bb0be63d5fc/volumes" Mar 18 15:59:59 crc kubenswrapper[4792]: I0318 15:59:59.986798 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 15:59:59 crc 
kubenswrapper[4792]: I0318 15:59:59.986852 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.039145 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.039190 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.046283 4792 scope.go:117] "RemoveContainer" containerID="ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.046364 4792 scope.go:117] "RemoveContainer" containerID="26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2" Mar 18 16:00:00 crc kubenswrapper[4792]: E0318 16:00:00.046658 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-845984c8c-xtlbs_openstack(1fb13e2f-b3b8-4485-8d1f-6627ac01b27c)\"" pod="openstack/heat-cfnapi-845984c8c-xtlbs" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" Mar 18 16:00:00 crc kubenswrapper[4792]: E0318 16:00:00.046678 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7846658d4b-mc89v_openstack(3a42eef3-14d6-4b01-b5d0-a0d74399d586)\"" pod="openstack/heat-api-7846658d4b-mc89v" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.164905 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"] Mar 18 16:00:00 crc kubenswrapper[4792]: E0318 16:00:00.165663 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.165687 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" Mar 18 16:00:00 crc kubenswrapper[4792]: E0318 16:00:00.165748 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.165758 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.166035 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f058b5-e315-4e43-a124-9bb0be63d5fc" containerName="heat-cfnapi" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.166086 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.167200 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.169616 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.170280 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.220932 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.221133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhszb\" (UniqueName: \"kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.221276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.234520 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564160-vrzft"] Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.236381 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-vrzft" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.257273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.257765 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.258009 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.314438 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-vrzft"] Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.322163 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.322267 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.322348 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.324418 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.324788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhszb\" (UniqueName: \"kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.324823 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.325084 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" gracePeriod=600 Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.325202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc2g\" (UniqueName: \"kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g\") pod \"auto-csr-approver-29564160-vrzft\" (UID: \"3a9fab6a-e73c-4238-9f8b-a37af84edd72\") " 
pod="openshift-infra/auto-csr-approver-29564160-vrzft" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.325378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.326450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.331399 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"] Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.344961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.350740 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhszb\" (UniqueName: \"kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb\") pod \"collect-profiles-29564160-4bs6m\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.428048 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc2g\" (UniqueName: \"kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g\") pod \"auto-csr-approver-29564160-vrzft\" (UID: \"3a9fab6a-e73c-4238-9f8b-a37af84edd72\") " pod="openshift-infra/auto-csr-approver-29564160-vrzft"
Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.454469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvc2g\" (UniqueName: \"kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g\") pod \"auto-csr-approver-29564160-vrzft\" (UID: \"3a9fab6a-e73c-4238-9f8b-a37af84edd72\") " pod="openshift-infra/auto-csr-approver-29564160-vrzft"
Mar 18 16:00:00 crc kubenswrapper[4792]: E0318 16:00:00.464062 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.548344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"
Mar 18 16:00:00 crc kubenswrapper[4792]: I0318 16:00:00.568719 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-vrzft"
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.059400 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" exitCode=0
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.059458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e"}
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.059558 4792 scope.go:117] "RemoveContainer" containerID="a93068a48274d54a195ba2b1867063b29cb44c6a806452b2de108e9e08cab78f"
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.060429 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e"
Mar 18 16:00:01 crc kubenswrapper[4792]: E0318 16:00:01.060771 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.192578 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"]
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.343719 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-vrzft"]
Mar 18 16:00:01 crc kubenswrapper[4792]: I0318 16:00:01.363295 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:00:02 crc kubenswrapper[4792]: I0318 16:00:02.074819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-vrzft" event={"ID":"3a9fab6a-e73c-4238-9f8b-a37af84edd72","Type":"ContainerStarted","Data":"cf8b6cd298fc965a44489c77a86b17c60004817ee5014de803e9411acf6a1160"}
Mar 18 16:00:02 crc kubenswrapper[4792]: I0318 16:00:02.077029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" event={"ID":"c5e39200-163d-47f3-a6bb-41fb28052c25","Type":"ContainerStarted","Data":"c23569782ba6f840646c7d4a6e0f628778acab1a87490df2c6af2de2027322ac"}
Mar 18 16:00:02 crc kubenswrapper[4792]: I0318 16:00:02.077087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" event={"ID":"c5e39200-163d-47f3-a6bb-41fb28052c25","Type":"ContainerStarted","Data":"84179a1a2b084b5fa7b1c466f018d1d7582ed2d3ac6d31cbde1292f529fb5cb4"}
Mar 18 16:00:02 crc kubenswrapper[4792]: I0318 16:00:02.108995 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" podStartSLOduration=2.108953613 podStartE2EDuration="2.108953613s" podCreationTimestamp="2026-03-18 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:02.099014979 +0000 UTC m=+1550.968343926" watchObservedRunningTime="2026-03-18 16:00:02.108953613 +0000 UTC m=+1550.978282550"
Mar 18 16:00:03 crc kubenswrapper[4792]: I0318 16:00:03.105100 4792 generic.go:334] "Generic (PLEG): container finished" podID="c5e39200-163d-47f3-a6bb-41fb28052c25" containerID="c23569782ba6f840646c7d4a6e0f628778acab1a87490df2c6af2de2027322ac" exitCode=0
Mar 18 16:00:03 crc kubenswrapper[4792]: I0318 16:00:03.105385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" event={"ID":"c5e39200-163d-47f3-a6bb-41fb28052c25","Type":"ContainerDied","Data":"c23569782ba6f840646c7d4a6e0f628778acab1a87490df2c6af2de2027322ac"}
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.541073 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.541732 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-log" containerID="cri-o://ad5fb530529a4dbccd795cc78a5b9e61b1472a301ede8f4476b67f8fb2bed1ed" gracePeriod=30
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.541921 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-httpd" containerID="cri-o://f8888569eee935a1c69bd0a3ce7065b1bc1591660f3caa6f971de14a046dee38" gracePeriod=30
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.785196 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.885143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhszb\" (UniqueName: \"kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb\") pod \"c5e39200-163d-47f3-a6bb-41fb28052c25\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") "
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.885446 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume\") pod \"c5e39200-163d-47f3-a6bb-41fb28052c25\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") "
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.885603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume\") pod \"c5e39200-163d-47f3-a6bb-41fb28052c25\" (UID: \"c5e39200-163d-47f3-a6bb-41fb28052c25\") "
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.887638 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5e39200-163d-47f3-a6bb-41fb28052c25" (UID: "c5e39200-163d-47f3-a6bb-41fb28052c25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.905607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5e39200-163d-47f3-a6bb-41fb28052c25" (UID: "c5e39200-163d-47f3-a6bb-41fb28052c25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.905727 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb" (OuterVolumeSpecName: "kube-api-access-qhszb") pod "c5e39200-163d-47f3-a6bb-41fb28052c25" (UID: "c5e39200-163d-47f3-a6bb-41fb28052c25"). InnerVolumeSpecName "kube-api-access-qhszb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.988884 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhszb\" (UniqueName: \"kubernetes.io/projected/c5e39200-163d-47f3-a6bb-41fb28052c25-kube-api-access-qhszb\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.988928 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5e39200-163d-47f3-a6bb-41fb28052c25-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.988941 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5e39200-163d-47f3-a6bb-41fb28052c25-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:04 crc kubenswrapper[4792]: I0318 16:00:04.997627 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6c99df75d9-tzn29"
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.062310 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"]
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.062612 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-55c78ffbd6-zkvtj" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" containerName="heat-engine" containerID="cri-o://86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" gracePeriod=60
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.137018 4792 generic.go:334] "Generic (PLEG): container finished" podID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerID="ad5fb530529a4dbccd795cc78a5b9e61b1472a301ede8f4476b67f8fb2bed1ed" exitCode=143
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.137080 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerDied","Data":"ad5fb530529a4dbccd795cc78a5b9e61b1472a301ede8f4476b67f8fb2bed1ed"}
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.139422 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m" event={"ID":"c5e39200-163d-47f3-a6bb-41fb28052c25","Type":"ContainerDied","Data":"84179a1a2b084b5fa7b1c466f018d1d7582ed2d3ac6d31cbde1292f529fb5cb4"}
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.139481 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84179a1a2b084b5fa7b1c466f018d1d7582ed2d3ac6d31cbde1292f529fb5cb4"
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.139546 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"
Mar 18 16:00:05 crc kubenswrapper[4792]: E0318 16:00:05.464933 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 16:00:05 crc kubenswrapper[4792]: E0318 16:00:05.471499 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 16:00:05 crc kubenswrapper[4792]: E0318 16:00:05.473766 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 18 16:00:05 crc kubenswrapper[4792]: E0318 16:00:05.473851 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55c78ffbd6-zkvtj" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" containerName="heat-engine"
Mar 18 16:00:05 crc kubenswrapper[4792]: I0318 16:00:05.941064 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.021852 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026126 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026295 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026413 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnt4v\" (UniqueName: \"kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.026608 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts\") pod \"a600353b-7ea9-490d-965b-ff93147642c4\" (UID: \"a600353b-7ea9-490d-965b-ff93147642c4\") "
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.034468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.036417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.042234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts" (OuterVolumeSpecName: "scripts") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.043575 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v" (OuterVolumeSpecName: "kube-api-access-hnt4v") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "kube-api-access-hnt4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.059602 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-87c7b6cff-mr5gh" podUID="3fd853fa-1d25-4065-ae01-709ad8473497" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.232:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.102078 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.130962 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.131027 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a600353b-7ea9-490d-965b-ff93147642c4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.131071 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnt4v\" (UniqueName: \"kubernetes.io/projected/a600353b-7ea9-490d-965b-ff93147642c4-kube-api-access-hnt4v\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.131085 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.131097 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160713 4792 generic.go:334] "Generic (PLEG): container finished" podID="a600353b-7ea9-490d-965b-ff93147642c4" containerID="f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4" exitCode=137
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerDied","Data":"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"}
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160784 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a600353b-7ea9-490d-965b-ff93147642c4","Type":"ContainerDied","Data":"3ea2d0ed1ed0390cafeeee133e82a1ca5464f1f496941b27bb3125cecfe3c0a3"}
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160800 4792 scope.go:117] "RemoveContainer" containerID="f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160899 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.160947 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.165420 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-log" containerID="cri-o://970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0" gracePeriod=30
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.165634 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-httpd" containerID="cri-o://91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531" gracePeriod=30
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.269162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.287959 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data" (OuterVolumeSpecName: "config-data") pod "a600353b-7ea9-490d-965b-ff93147642c4" (UID: "a600353b-7ea9-490d-965b-ff93147642c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.294070 4792 scope.go:117] "RemoveContainer" containerID="b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.322534 4792 scope.go:117] "RemoveContainer" containerID="74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.335120 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.335161 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a600353b-7ea9-490d-965b-ff93147642c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.353368 4792 scope.go:117] "RemoveContainer" containerID="96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.395832 4792 scope.go:117] "RemoveContainer" containerID="f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.399523 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4\": container with ID starting with f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4 not found: ID does not exist" containerID="f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.399571 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4"} err="failed to get container status \"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4\": rpc error: code = NotFound desc = could not find container \"f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4\": container with ID starting with f7d7c5acc3c4962c64de1ae44434ec3bee9fb69cbe66c7201014cdd7dbab7ca4 not found: ID does not exist"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.399595 4792 scope.go:117] "RemoveContainer" containerID="b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.400290 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5\": container with ID starting with b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5 not found: ID does not exist" containerID="b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.400335 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5"} err="failed to get container status \"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5\": rpc error: code = NotFound desc = could not find container \"b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5\": container with ID starting with b2fd22dfaf283390463278a85e112c2fe7e4d28ac614a51c8623a76b2899dce5 not found: ID does not exist"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.400370 4792 scope.go:117] "RemoveContainer" containerID="74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.401039 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f\": container with ID starting with 74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f not found: ID does not exist" containerID="74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.401083 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f"} err="failed to get container status \"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f\": rpc error: code = NotFound desc = could not find container \"74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f\": container with ID starting with 74524d2929dd615d308392af5ef5bc416eefdbec8eac16caa295127f146efe1f not found: ID does not exist"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.401111 4792 scope.go:117] "RemoveContainer" containerID="96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.401548 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82\": container with ID starting with 96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82 not found: ID does not exist" containerID="96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.401575 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82"} err="failed to get container status \"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82\": rpc error: code = NotFound desc = could not find container \"96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82\": container with ID starting with 96d89d4fd06edb4a8d1d8ab4c12e535492c9ae9ef5872a28ee851331a8973b82 not found: ID does not exist"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.551179 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.566409 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598051 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.598613 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-central-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598640 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-central-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.598692 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e39200-163d-47f3-a6bb-41fb28052c25" containerName="collect-profiles"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598701 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e39200-163d-47f3-a6bb-41fb28052c25" containerName="collect-profiles"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.598713 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="proxy-httpd"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598721 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="proxy-httpd"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.598740 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="sg-core"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598747 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="sg-core"
Mar 18 16:00:06 crc kubenswrapper[4792]: E0318 16:00:06.598762 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-notification-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.598769 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-notification-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.599118 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="sg-core"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.599155 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e39200-163d-47f3-a6bb-41fb28052c25" containerName="collect-profiles"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.599175 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="proxy-httpd"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.599188 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-notification-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.599210 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a600353b-7ea9-490d-965b-ff93147642c4" containerName="ceilometer-central-agent"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.606575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.610125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.610342 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.624289 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.703441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7d6bbd8cf5-xbjzv"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753513 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psb95\" (UniqueName: \"kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.753889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.785812 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"]
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.856596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857357 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857472 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psb95\" (UniqueName: \"kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.857569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.858553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.864357 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.883361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.885899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.892322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0"
Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.898994 4792 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"kube-api-access-psb95\" (UniqueName: \"kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0" Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.899390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " pod="openstack/ceilometer-0" Mar 18 16:00:06 crc kubenswrapper[4792]: I0318 16:00:06.937692 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.170287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.226842 4792 generic.go:334] "Generic (PLEG): container finished" podID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerID="970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0" exitCode=143 Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.226899 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerDied","Data":"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0"} Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.375507 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.528322 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.586582 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom\") pod \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.586929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4wl9\" (UniqueName: \"kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9\") pod \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.587347 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data\") pod \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.587487 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle\") pod \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\" (UID: \"3a42eef3-14d6-4b01-b5d0-a0d74399d586\") " Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.594517 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a42eef3-14d6-4b01-b5d0-a0d74399d586" (UID: "3a42eef3-14d6-4b01-b5d0-a0d74399d586"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.600122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9" (OuterVolumeSpecName: "kube-api-access-m4wl9") pod "3a42eef3-14d6-4b01-b5d0-a0d74399d586" (UID: "3a42eef3-14d6-4b01-b5d0-a0d74399d586"). InnerVolumeSpecName "kube-api-access-m4wl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.635223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a42eef3-14d6-4b01-b5d0-a0d74399d586" (UID: "3a42eef3-14d6-4b01-b5d0-a0d74399d586"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.668567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data" (OuterVolumeSpecName: "config-data") pod "3a42eef3-14d6-4b01-b5d0-a0d74399d586" (UID: "3a42eef3-14d6-4b01-b5d0-a0d74399d586"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.691863 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.691911 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4wl9\" (UniqueName: \"kubernetes.io/projected/3a42eef3-14d6-4b01-b5d0-a0d74399d586-kube-api-access-m4wl9\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.691926 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.691940 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a42eef3-14d6-4b01-b5d0-a0d74399d586-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.880780 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a600353b-7ea9-490d-965b-ff93147642c4" path="/var/lib/kubelet/pods/a600353b-7ea9-490d-965b-ff93147642c4/volumes" Mar 18 16:00:07 crc kubenswrapper[4792]: I0318 16:00:07.952762 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.036581 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.205:9292/healthcheck\": read tcp 10.217.0.2:41696->10.217.0.205:9292: read: connection reset by peer" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.036867 4792 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9292/healthcheck\": read tcp 10.217.0.2:41682->10.217.0.205:9292: read: connection reset by peer" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.215851 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.244106 4792 generic.go:334] "Generic (PLEG): container finished" podID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerID="f8888569eee935a1c69bd0a3ce7065b1bc1591660f3caa6f971de14a046dee38" exitCode=0 Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.244181 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerDied","Data":"f8888569eee935a1c69bd0a3ce7065b1bc1591660f3caa6f971de14a046dee38"} Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.249174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerStarted","Data":"f8a80b2e9f8069c1986b2b43f3f4041979aad2e56c0e801a15d0efcbcc0add1b"} Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.253418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-845984c8c-xtlbs" event={"ID":"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c","Type":"ContainerDied","Data":"1537a60b37149d2a0bee3bface44cbc26ed5f0d1215b76c0de7c34b36df22e8e"} Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.253478 4792 scope.go:117] "RemoveContainer" containerID="ebf9fca722365e2754a9049f2f634b5df767105dbed5c0442135bede2ab00961" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.253553 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-845984c8c-xtlbs" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.263871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7846658d4b-mc89v" event={"ID":"3a42eef3-14d6-4b01-b5d0-a0d74399d586","Type":"ContainerDied","Data":"4585100fe59d009bd6bc88503c28d8d971e676a9085c268256f6ab5fbd446d74"} Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.263991 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7846658d4b-mc89v" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.305688 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data\") pod \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.306056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle\") pod \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.306115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2pq5\" (UniqueName: \"kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5\") pod \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.306303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom\") pod \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\" (UID: \"1fb13e2f-b3b8-4485-8d1f-6627ac01b27c\") " Mar 
18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.312937 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.315228 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" (UID: "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.325220 4792 scope.go:117] "RemoveContainer" containerID="26fc88f11506c71c168efa251e2a3cc901adfca6f83ae02dff6b3cb9d83073a2" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.344821 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7846658d4b-mc89v"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.352112 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5" (OuterVolumeSpecName: "kube-api-access-p2pq5") pod "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" (UID: "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c"). InnerVolumeSpecName "kube-api-access-p2pq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.392567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" (UID: "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.402515 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data" (OuterVolumeSpecName: "config-data") pod "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" (UID: "1fb13e2f-b3b8-4485-8d1f-6627ac01b27c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.416394 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.416696 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2pq5\" (UniqueName: \"kubernetes.io/projected/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-kube-api-access-p2pq5\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.416767 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.416851 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.602149 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.611564 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-845984c8c-xtlbs"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.635882 4792 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.726259 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.726582 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727123 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727219 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs" (OuterVolumeSpecName: "logs") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727657 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727780 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.727888 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gtp\" (UniqueName: \"kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.728045 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.728186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data\") pod \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\" (UID: \"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb\") " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.728846 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.728926 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.734084 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts" (OuterVolumeSpecName: "scripts") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.735049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp" (OuterVolumeSpecName: "kube-api-access-c6gtp") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "kube-api-access-c6gtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.765383 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b" (OuterVolumeSpecName: "glance") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "pvc-8bcf0461-07ee-43b5-b329-fde264578b3b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.790212 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.831849 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.831891 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gtp\" (UniqueName: \"kubernetes.io/projected/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-kube-api-access-c6gtp\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.831901 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.831924 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") on node \"crc\" " Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.887439 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.925888 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.926118 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8bcf0461-07ee-43b5-b329-fde264578b3b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b") on node "crc" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.930263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data" (OuterVolumeSpecName: "config-data") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.936181 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" (UID: "b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.937534 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.937580 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:08 crc kubenswrapper[4792]: I0318 16:00:08.937598 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.288551 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb","Type":"ContainerDied","Data":"f2f2e10b307366e85eb321522a824018c769c9876c759822963d016084516447"} Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.289067 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.289246 4792 scope.go:117] "RemoveContainer" containerID="f8888569eee935a1c69bd0a3ce7065b1bc1591660f3caa6f971de14a046dee38" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.302271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerStarted","Data":"214c25512ab085cac1c1d3b44799a8e7f9b21a549aa7b5e3bdc74e6e3342030a"} Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.367386 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.370419 4792 scope.go:117] "RemoveContainer" containerID="ad5fb530529a4dbccd795cc78a5b9e61b1472a301ede8f4476b67f8fb2bed1ed" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.405485 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445098 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.445689 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445704 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.445752 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445759 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 
16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.445783 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-log" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445790 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-log" Mar 18 16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.445806 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445812 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.445824 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-httpd" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.445831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-httpd" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446074 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-log" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446107 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" containerName="glance-httpd" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446163 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446178 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 
16:00:09.446190 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446207 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" containerName="heat-cfnapi" Mar 18 16:00:09 crc kubenswrapper[4792]: E0318 16:00:09.446489 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.446501 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" containerName="heat-api" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.447788 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.452810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.453096 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.466508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.569013 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.569260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.569394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.569432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.583141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-logs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.583300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.583335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.583362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpf4t\" (UniqueName: \"kubernetes.io/projected/6b102e52-1964-4051-b1f3-e066c77b7919-kube-api-access-lpf4t\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.686599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.686648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.687755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.687804 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.687990 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-logs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.688073 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.688104 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.688130 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpf4t\" (UniqueName: \"kubernetes.io/projected/6b102e52-1964-4051-b1f3-e066c77b7919-kube-api-access-lpf4t\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.688866 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.689178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b102e52-1964-4051-b1f3-e066c77b7919-logs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.709014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.709506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.709763 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.709823 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f4135ef5c687380422f9124ccce113815e08bdbecc9d37bdfa336b12e119b7ff/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.712859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.713558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b102e52-1964-4051-b1f3-e066c77b7919-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.724660 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpf4t\" (UniqueName: \"kubernetes.io/projected/6b102e52-1964-4051-b1f3-e066c77b7919-kube-api-access-lpf4t\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.834318 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bcf0461-07ee-43b5-b329-fde264578b3b\") pod \"glance-default-external-api-0\" (UID: \"6b102e52-1964-4051-b1f3-e066c77b7919\") " pod="openstack/glance-default-external-api-0" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.885315 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb13e2f-b3b8-4485-8d1f-6627ac01b27c" path="/var/lib/kubelet/pods/1fb13e2f-b3b8-4485-8d1f-6627ac01b27c/volumes" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.886871 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a42eef3-14d6-4b01-b5d0-a0d74399d586" path="/var/lib/kubelet/pods/3a42eef3-14d6-4b01-b5d0-a0d74399d586/volumes" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.887513 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb" path="/var/lib/kubelet/pods/b1f9ef10-a6e9-49ac-bbcd-9786c9e08bdb/volumes" Mar 18 16:00:09 crc kubenswrapper[4792]: I0318 16:00:09.914634 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.281996 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.372471 4792 generic.go:334] "Generic (PLEG): container finished" podID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerID="91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531" exitCode=0 Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.372530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerDied","Data":"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531"} Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.372562 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e","Type":"ContainerDied","Data":"b20d86640397a3070507e865b74858fef67c93c85ea8ddb9f09ad594bfab4a85"} Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.372582 4792 scope.go:117] "RemoveContainer" containerID="91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.372588 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.428310 4792 scope.go:117] "RemoveContainer" containerID="970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.443429 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.443488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.443600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.443634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.447078 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs" (OuterVolumeSpecName: "logs") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.447156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.447265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.447301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9bng\" (UniqueName: \"kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.447535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts\") pod \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\" (UID: \"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e\") " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.448610 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.452080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.465654 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng" (OuterVolumeSpecName: "kube-api-access-x9bng") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "kube-api-access-x9bng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.481560 4792 scope.go:117] "RemoveContainer" containerID="91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.481660 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts" (OuterVolumeSpecName: "scripts") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: E0318 16:00:10.484222 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531\": container with ID starting with 91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531 not found: ID does not exist" containerID="91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.484285 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531"} err="failed to get container status \"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531\": rpc error: code = NotFound desc = could not find container \"91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531\": container with ID starting with 91cfbbfd95ae8fb22de36e0eae4830386e9700da07a5bb0fb04793a71346a531 not found: ID does not exist" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.484319 4792 scope.go:117] "RemoveContainer" containerID="970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0" Mar 18 16:00:10 crc kubenswrapper[4792]: E0318 16:00:10.486841 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0\": container with ID starting with 970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0 not found: ID does not exist" containerID="970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.486875 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0"} err="failed 
to get container status \"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0\": rpc error: code = NotFound desc = could not find container \"970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0\": container with ID starting with 970fc7348f73499462583979387c80aeea031790480fd6086e864c0435bf46b0 not found: ID does not exist" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.552724 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.552761 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9bng\" (UniqueName: \"kubernetes.io/projected/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-kube-api-access-x9bng\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.552771 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.676464 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5" (OuterVolumeSpecName: "glance") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "pvc-19331b78-34cf-4f66-a604-660df9e579f5". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.743367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.758833 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.758886 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") on node \"crc\" " Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.769257 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.773628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.779126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data" (OuterVolumeSpecName: "config-data") pod "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" (UID: "bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.809263 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.809468 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-19331b78-34cf-4f66-a604-660df9e579f5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5") on node "crc" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.861259 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.861306 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:10 crc kubenswrapper[4792]: I0318 16:00:10.861322 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.034021 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.057059 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.079847 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:11 crc kubenswrapper[4792]: E0318 16:00:11.080522 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-log" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.080544 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" 
containerName="glance-log" Mar 18 16:00:11 crc kubenswrapper[4792]: E0318 16:00:11.080581 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-httpd" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.080591 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-httpd" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.081015 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-log" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.081045 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" containerName="glance-httpd" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.082773 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.092115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.092354 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.118513 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177172 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177345 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177518 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177634 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177707 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dms4k\" (UniqueName: \"kubernetes.io/projected/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-kube-api-access-dms4k\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.177801 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-logs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc 
kubenswrapper[4792]: I0318 16:00:11.177886 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.178459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.281366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.281769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.281818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dms4k\" (UniqueName: \"kubernetes.io/projected/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-kube-api-access-dms4k\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.283362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-logs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.283420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.283455 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.283718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.283795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 
16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.287701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.288115 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-logs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.290284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.290797 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.290832 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b0cea5cb737a527758633cd5e46883214d92b47b2dcb8de95269a535706d518/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.291849 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.292659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.293814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.304160 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dms4k\" (UniqueName: 
\"kubernetes.io/projected/419ab4d0-1257-4c7d-89de-2d2ebcee1a74-kube-api-access-dms4k\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.377558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19331b78-34cf-4f66-a604-660df9e579f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19331b78-34cf-4f66-a604-660df9e579f5\") pod \"glance-default-internal-api-0\" (UID: \"419ab4d0-1257-4c7d-89de-2d2ebcee1a74\") " pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.398511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerStarted","Data":"c03bc7dc43b0e6b9f6cf0ec41c5c1dce60b8293bfddfe846a41d1f0cfc5926d6"} Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.400036 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b102e52-1964-4051-b1f3-e066c77b7919","Type":"ContainerStarted","Data":"1551b58223819683abd793bb50ed68a60a6c3a03ab033200292c7fe8cfa64b55"} Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.425478 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:11 crc kubenswrapper[4792]: I0318 16:00:11.895174 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e" path="/var/lib/kubelet/pods/bc5d6019-e9d3-495d-b0fb-0b5e02a6a78e/volumes" Mar 18 16:00:12 crc kubenswrapper[4792]: I0318 16:00:12.101465 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 16:00:12 crc kubenswrapper[4792]: I0318 16:00:12.420602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b102e52-1964-4051-b1f3-e066c77b7919","Type":"ContainerStarted","Data":"54ed39a330996cea1d1ce2657532dc84b0f3b7a93d35c2bb4dd7e3a10d7163dd"} Mar 18 16:00:12 crc kubenswrapper[4792]: I0318 16:00:12.422649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"419ab4d0-1257-4c7d-89de-2d2ebcee1a74","Type":"ContainerStarted","Data":"bdefaafd39ba6774544b58854863b9fc582edb45397daf2f4b466f9ffdb24174"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.364252 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.448024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b102e52-1964-4051-b1f3-e066c77b7919","Type":"ContainerStarted","Data":"340b5a795ee4a46540b77c1edf1dff1ca4a4c034e5198e0846935053136cd717"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.449328 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle\") pod \"3019eb4a-4185-4235-97dd-4f3accae8352\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.449580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvfpb\" (UniqueName: \"kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb\") pod \"3019eb4a-4185-4235-97dd-4f3accae8352\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.449644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom\") pod \"3019eb4a-4185-4235-97dd-4f3accae8352\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.449698 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data\") pod \"3019eb4a-4185-4235-97dd-4f3accae8352\" (UID: \"3019eb4a-4185-4235-97dd-4f3accae8352\") " Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.452373 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"419ab4d0-1257-4c7d-89de-2d2ebcee1a74","Type":"ContainerStarted","Data":"0ad0dfc5ad5462a5f896d0cfca8589a3761f14890e3e069debd503e132f560b9"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.455502 4792 generic.go:334] "Generic (PLEG): container finished" podID="3019eb4a-4185-4235-97dd-4f3accae8352" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" exitCode=0 Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.455612 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55c78ffbd6-zkvtj" event={"ID":"3019eb4a-4185-4235-97dd-4f3accae8352","Type":"ContainerDied","Data":"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.455651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55c78ffbd6-zkvtj" event={"ID":"3019eb4a-4185-4235-97dd-4f3accae8352","Type":"ContainerDied","Data":"76b0c8e62f06cc0b4012175ff2bd5e2ade62041febacebecdadf5f4f3cc404f3"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.455668 4792 scope.go:117] "RemoveContainer" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.455845 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55c78ffbd6-zkvtj" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.456397 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb" (OuterVolumeSpecName: "kube-api-access-cvfpb") pod "3019eb4a-4185-4235-97dd-4f3accae8352" (UID: "3019eb4a-4185-4235-97dd-4f3accae8352"). InnerVolumeSpecName "kube-api-access-cvfpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.457537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3019eb4a-4185-4235-97dd-4f3accae8352" (UID: "3019eb4a-4185-4235-97dd-4f3accae8352"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.468140 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerStarted","Data":"56aba528cd03eb28f154db67cf96d36ff4173f8b208479fbbb391050626faf78"} Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.480300 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.480275391 podStartE2EDuration="4.480275391s" podCreationTimestamp="2026-03-18 16:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:13.471394727 +0000 UTC m=+1562.340723674" watchObservedRunningTime="2026-03-18 16:00:13.480275391 +0000 UTC m=+1562.349604328" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.512449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3019eb4a-4185-4235-97dd-4f3accae8352" (UID: "3019eb4a-4185-4235-97dd-4f3accae8352"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.533365 4792 scope.go:117] "RemoveContainer" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" Mar 18 16:00:13 crc kubenswrapper[4792]: E0318 16:00:13.533867 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552\": container with ID starting with 86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552 not found: ID does not exist" containerID="86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.533905 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552"} err="failed to get container status \"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552\": rpc error: code = NotFound desc = could not find container \"86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552\": container with ID starting with 86ec9728c6890442c6e94438601a398bbd89bf8d2246e26257daf977efa88552 not found: ID does not exist" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.542120 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data" (OuterVolumeSpecName: "config-data") pod "3019eb4a-4185-4235-97dd-4f3accae8352" (UID: "3019eb4a-4185-4235-97dd-4f3accae8352"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.553426 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvfpb\" (UniqueName: \"kubernetes.io/projected/3019eb4a-4185-4235-97dd-4f3accae8352-kube-api-access-cvfpb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.553464 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.553473 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.553483 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3019eb4a-4185-4235-97dd-4f3accae8352-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.810793 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"] Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.826055 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-55c78ffbd6-zkvtj"] Mar 18 16:00:13 crc kubenswrapper[4792]: I0318 16:00:13.889025 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" path="/var/lib/kubelet/pods/3019eb4a-4185-4235-97dd-4f3accae8352/volumes" Mar 18 16:00:14 crc kubenswrapper[4792]: I0318 16:00:14.499596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"419ab4d0-1257-4c7d-89de-2d2ebcee1a74","Type":"ContainerStarted","Data":"461f3e33d0320ff562503b09773a99d1d412f71d2fd9b3065aca3feccce53d91"} Mar 18 16:00:14 crc kubenswrapper[4792]: I0318 16:00:14.507364 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-vrzft" event={"ID":"3a9fab6a-e73c-4238-9f8b-a37af84edd72","Type":"ContainerStarted","Data":"6a93449c805557502ad71f976f0fa2e7e80e09a70ca97dab75d47cb15d1ce211"} Mar 18 16:00:14 crc kubenswrapper[4792]: I0318 16:00:14.555405 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.555382977 podStartE2EDuration="3.555382977s" podCreationTimestamp="2026-03-18 16:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:14.526851775 +0000 UTC m=+1563.396180712" watchObservedRunningTime="2026-03-18 16:00:14.555382977 +0000 UTC m=+1563.424711914" Mar 18 16:00:14 crc kubenswrapper[4792]: I0318 16:00:14.562630 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564160-vrzft" podStartSLOduration=2.304457126 podStartE2EDuration="14.562608111s" podCreationTimestamp="2026-03-18 16:00:00 +0000 UTC" firstStartedPulling="2026-03-18 16:00:01.3629532 +0000 UTC m=+1550.232282137" lastFinishedPulling="2026-03-18 16:00:13.621104185 +0000 UTC m=+1562.490433122" observedRunningTime="2026-03-18 16:00:14.544081857 +0000 UTC m=+1563.413410794" watchObservedRunningTime="2026-03-18 16:00:14.562608111 +0000 UTC m=+1563.431937048" Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.527607 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-central-agent" 
containerID="cri-o://214c25512ab085cac1c1d3b44799a8e7f9b21a549aa7b5e3bdc74e6e3342030a" gracePeriod=30 Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.528639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerStarted","Data":"970f5a26a22657fbae38b55e0c0b969cd0faac42e0d707663083ebe614d6c790"} Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.529438 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.529904 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="proxy-httpd" containerID="cri-o://970f5a26a22657fbae38b55e0c0b969cd0faac42e0d707663083ebe614d6c790" gracePeriod=30 Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.530830 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="sg-core" containerID="cri-o://56aba528cd03eb28f154db67cf96d36ff4173f8b208479fbbb391050626faf78" gracePeriod=30 Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.530901 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-notification-agent" containerID="cri-o://c03bc7dc43b0e6b9f6cf0ec41c5c1dce60b8293bfddfe846a41d1f0cfc5926d6" gracePeriod=30 Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.565518 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.865656197 podStartE2EDuration="9.565492064s" podCreationTimestamp="2026-03-18 16:00:06 +0000 UTC" firstStartedPulling="2026-03-18 16:00:07.956194501 +0000 UTC m=+1556.825523438" 
lastFinishedPulling="2026-03-18 16:00:14.656030368 +0000 UTC m=+1563.525359305" observedRunningTime="2026-03-18 16:00:15.552115359 +0000 UTC m=+1564.421444296" watchObservedRunningTime="2026-03-18 16:00:15.565492064 +0000 UTC m=+1564.434821011" Mar 18 16:00:15 crc kubenswrapper[4792]: I0318 16:00:15.854905 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:00:15 crc kubenswrapper[4792]: E0318 16:00:15.855713 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.541799 4792 generic.go:334] "Generic (PLEG): container finished" podID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerID="970f5a26a22657fbae38b55e0c0b969cd0faac42e0d707663083ebe614d6c790" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.541834 4792 generic.go:334] "Generic (PLEG): container finished" podID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerID="56aba528cd03eb28f154db67cf96d36ff4173f8b208479fbbb391050626faf78" exitCode=2 Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.541853 4792 generic.go:334] "Generic (PLEG): container finished" podID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerID="c03bc7dc43b0e6b9f6cf0ec41c5c1dce60b8293bfddfe846a41d1f0cfc5926d6" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.541887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerDied","Data":"970f5a26a22657fbae38b55e0c0b969cd0faac42e0d707663083ebe614d6c790"} Mar 18 16:00:16 
crc kubenswrapper[4792]: I0318 16:00:16.541913 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerDied","Data":"56aba528cd03eb28f154db67cf96d36ff4173f8b208479fbbb391050626faf78"} Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.541922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerDied","Data":"c03bc7dc43b0e6b9f6cf0ec41c5c1dce60b8293bfddfe846a41d1f0cfc5926d6"} Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.543952 4792 generic.go:334] "Generic (PLEG): container finished" podID="29554117-39ac-4a2b-bd31-4d6858fb7931" containerID="9110fc60e1686b32186ba347c797f65e8a8ec417eeb79ae7dabc6926895c80bb" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.544022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" event={"ID":"29554117-39ac-4a2b-bd31-4d6858fb7931","Type":"ContainerDied","Data":"9110fc60e1686b32186ba347c797f65e8a8ec417eeb79ae7dabc6926895c80bb"} Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.550438 4792 generic.go:334] "Generic (PLEG): container finished" podID="3a9fab6a-e73c-4238-9f8b-a37af84edd72" containerID="6a93449c805557502ad71f976f0fa2e7e80e09a70ca97dab75d47cb15d1ce211" exitCode=0 Mar 18 16:00:16 crc kubenswrapper[4792]: I0318 16:00:16.550496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-vrzft" event={"ID":"3a9fab6a-e73c-4238-9f8b-a37af84edd72","Type":"ContainerDied","Data":"6a93449c805557502ad71f976f0fa2e7e80e09a70ca97dab75d47cb15d1ce211"} Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.174276 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.182562 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-vrzft" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.274838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle\") pod \"29554117-39ac-4a2b-bd31-4d6858fb7931\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.274898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data\") pod \"29554117-39ac-4a2b-bd31-4d6858fb7931\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.274957 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gffn6\" (UniqueName: \"kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6\") pod \"29554117-39ac-4a2b-bd31-4d6858fb7931\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.275023 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvc2g\" (UniqueName: \"kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g\") pod \"3a9fab6a-e73c-4238-9f8b-a37af84edd72\" (UID: \"3a9fab6a-e73c-4238-9f8b-a37af84edd72\") " Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.275323 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts\") pod 
\"29554117-39ac-4a2b-bd31-4d6858fb7931\" (UID: \"29554117-39ac-4a2b-bd31-4d6858fb7931\") " Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.283261 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g" (OuterVolumeSpecName: "kube-api-access-xvc2g") pod "3a9fab6a-e73c-4238-9f8b-a37af84edd72" (UID: "3a9fab6a-e73c-4238-9f8b-a37af84edd72"). InnerVolumeSpecName "kube-api-access-xvc2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.283338 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6" (OuterVolumeSpecName: "kube-api-access-gffn6") pod "29554117-39ac-4a2b-bd31-4d6858fb7931" (UID: "29554117-39ac-4a2b-bd31-4d6858fb7931"). InnerVolumeSpecName "kube-api-access-gffn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.284966 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts" (OuterVolumeSpecName: "scripts") pod "29554117-39ac-4a2b-bd31-4d6858fb7931" (UID: "29554117-39ac-4a2b-bd31-4d6858fb7931"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.319196 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data" (OuterVolumeSpecName: "config-data") pod "29554117-39ac-4a2b-bd31-4d6858fb7931" (UID: "29554117-39ac-4a2b-bd31-4d6858fb7931"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.319248 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29554117-39ac-4a2b-bd31-4d6858fb7931" (UID: "29554117-39ac-4a2b-bd31-4d6858fb7931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.381058 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.381128 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.381140 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29554117-39ac-4a2b-bd31-4d6858fb7931-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.381152 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gffn6\" (UniqueName: \"kubernetes.io/projected/29554117-39ac-4a2b-bd31-4d6858fb7931-kube-api-access-gffn6\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.381183 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvc2g\" (UniqueName: \"kubernetes.io/projected/3a9fab6a-e73c-4238-9f8b-a37af84edd72-kube-api-access-xvc2g\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.581565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-jjzdm" event={"ID":"29554117-39ac-4a2b-bd31-4d6858fb7931","Type":"ContainerDied","Data":"ebfce030f3fe73c3648168536a986fbaa7f8149f8b8bf9734b6cc0b145786817"} Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.581613 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebfce030f3fe73c3648168536a986fbaa7f8149f8b8bf9734b6cc0b145786817" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.581700 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jjzdm" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.587743 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564160-vrzft" event={"ID":"3a9fab6a-e73c-4238-9f8b-a37af84edd72","Type":"ContainerDied","Data":"cf8b6cd298fc965a44489c77a86b17c60004817ee5014de803e9411acf6a1160"} Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.587797 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf8b6cd298fc965a44489c77a86b17c60004817ee5014de803e9411acf6a1160" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.587868 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564160-vrzft" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.754965 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:00:18 crc kubenswrapper[4792]: E0318 16:00:18.755523 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9fab6a-e73c-4238-9f8b-a37af84edd72" containerName="oc" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755540 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9fab6a-e73c-4238-9f8b-a37af84edd72" containerName="oc" Mar 18 16:00:18 crc kubenswrapper[4792]: E0318 16:00:18.755589 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29554117-39ac-4a2b-bd31-4d6858fb7931" containerName="nova-cell0-conductor-db-sync" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755598 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29554117-39ac-4a2b-bd31-4d6858fb7931" containerName="nova-cell0-conductor-db-sync" Mar 18 16:00:18 crc kubenswrapper[4792]: E0318 16:00:18.755614 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" containerName="heat-engine" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755623 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" containerName="heat-engine" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755920 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9fab6a-e73c-4238-9f8b-a37af84edd72" containerName="oc" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755948 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3019eb4a-4185-4235-97dd-4f3accae8352" containerName="heat-engine" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.755966 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29554117-39ac-4a2b-bd31-4d6858fb7931" 
containerName="nova-cell0-conductor-db-sync" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.757132 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.766166 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d6kvz" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.766187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.773682 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-zvm46"] Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.802885 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564154-zvm46"] Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.813838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.942304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.942807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:18 crc kubenswrapper[4792]: I0318 16:00:18.943078 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwtps\" (UniqueName: \"kubernetes.io/projected/f0326f82-a981-420a-be10-4364a620bdfd-kube-api-access-wwtps\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.045633 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwtps\" (UniqueName: \"kubernetes.io/projected/f0326f82-a981-420a-be10-4364a620bdfd-kube-api-access-wwtps\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.045738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.045853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.050691 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.055641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f0326f82-a981-420a-be10-4364a620bdfd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.068946 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwtps\" (UniqueName: \"kubernetes.io/projected/f0326f82-a981-420a-be10-4364a620bdfd-kube-api-access-wwtps\") pod \"nova-cell0-conductor-0\" (UID: \"f0326f82-a981-420a-be10-4364a620bdfd\") " pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.094886 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.880075 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2520256f-dcf3-41f8-9dcf-73e8963a5ae0" path="/var/lib/kubelet/pods/2520256f-dcf3-41f8-9dcf-73e8963a5ae0/volumes" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.917933 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 16:00:19 crc kubenswrapper[4792]: I0318 16:00:19.918910 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:19.997443 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.023075 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.321157 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.887944 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f0326f82-a981-420a-be10-4364a620bdfd","Type":"ContainerStarted","Data":"ac1d77dcd9442f363043a7bd1a65c834283352dcbd400c25ae4727fa31649aff"} Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.889202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f0326f82-a981-420a-be10-4364a620bdfd","Type":"ContainerStarted","Data":"373947cde6dcc329754bfa7edf596ca85c541646f5a5c153e82e61ad7ab02d0c"} Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.889526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.890037 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.890129 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:20 crc kubenswrapper[4792]: I0318 16:00:20.917082 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.917061491 podStartE2EDuration="2.917061491s" podCreationTimestamp="2026-03-18 16:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:20.911116047 +0000 UTC m=+1569.780444984" watchObservedRunningTime="2026-03-18 16:00:20.917061491 +0000 UTC m=+1569.786390428" Mar 18 16:00:21 crc kubenswrapper[4792]: I0318 16:00:21.427188 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:21 crc kubenswrapper[4792]: I0318 16:00:21.427608 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:21 
crc kubenswrapper[4792]: I0318 16:00:21.490222 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:21 crc kubenswrapper[4792]: I0318 16:00:21.508888 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:21 crc kubenswrapper[4792]: I0318 16:00:21.902075 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:21 crc kubenswrapper[4792]: I0318 16:00:21.902116 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.265991 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.266382 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.290601 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.307830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.308010 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:00:24 crc kubenswrapper[4792]: I0318 16:00:24.395651 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 16:00:27 crc kubenswrapper[4792]: I0318 16:00:27.976388 4792 generic.go:334] "Generic (PLEG): container finished" podID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerID="214c25512ab085cac1c1d3b44799a8e7f9b21a549aa7b5e3bdc74e6e3342030a" exitCode=0 Mar 18 16:00:27 crc 
kubenswrapper[4792]: I0318 16:00:27.976492 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerDied","Data":"214c25512ab085cac1c1d3b44799a8e7f9b21a549aa7b5e3bdc74e6e3342030a"} Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.141642 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304289 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psb95\" (UniqueName: \"kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304623 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304652 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304749 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.304817 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml\") pod \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\" (UID: \"4ba4a9a0-1204-4a89-a584-f4f78990fd48\") " Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.305075 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.305205 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.305690 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.305716 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ba4a9a0-1204-4a89-a584-f4f78990fd48-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.317258 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts" (OuterVolumeSpecName: "scripts") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.317517 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95" (OuterVolumeSpecName: "kube-api-access-psb95") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "kube-api-access-psb95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.356652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.357453 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:00:28 crc kubenswrapper[4792]: E0318 16:00:28.358177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-notification-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358202 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-notification-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: E0318 16:00:28.358247 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="sg-core" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358256 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="sg-core" Mar 18 16:00:28 crc kubenswrapper[4792]: E0318 16:00:28.358283 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-central-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358292 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-central-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: E0318 16:00:28.358304 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="proxy-httpd" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358311 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="proxy-httpd" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358604 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="sg-core" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358633 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="proxy-httpd" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358655 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-central-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.358675 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" containerName="ceilometer-notification-agent" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.361516 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.394050 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.408567 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.408612 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.408625 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psb95\" (UniqueName: \"kubernetes.io/projected/4ba4a9a0-1204-4a89-a584-f4f78990fd48-kube-api-access-psb95\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.448050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.491984 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data" (OuterVolumeSpecName: "config-data") pod "4ba4a9a0-1204-4a89-a584-f4f78990fd48" (UID: "4ba4a9a0-1204-4a89-a584-f4f78990fd48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.510935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.510993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2llk\" (UniqueName: \"kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.511250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 
crc kubenswrapper[4792]: I0318 16:00:28.511611 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.511628 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ba4a9a0-1204-4a89-a584-f4f78990fd48-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.613460 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.613512 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2llk\" (UniqueName: \"kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.613673 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.614296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities\") pod \"redhat-marketplace-92bs2\" (UID: 
\"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.614413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.636207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2llk\" (UniqueName: \"kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk\") pod \"redhat-marketplace-92bs2\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.814424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:28 crc kubenswrapper[4792]: I0318 16:00:28.854536 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:00:28 crc kubenswrapper[4792]: E0318 16:00:28.854922 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.001306 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4ba4a9a0-1204-4a89-a584-f4f78990fd48","Type":"ContainerDied","Data":"f8a80b2e9f8069c1986b2b43f3f4041979aad2e56c0e801a15d0efcbcc0add1b"} Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.001634 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.001649 4792 scope.go:117] "RemoveContainer" containerID="970f5a26a22657fbae38b55e0c0b969cd0faac42e0d707663083ebe614d6c790" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.045134 4792 scope.go:117] "RemoveContainer" containerID="56aba528cd03eb28f154db67cf96d36ff4173f8b208479fbbb391050626faf78" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.051702 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.075865 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.097551 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.102365 4792 scope.go:117] "RemoveContainer" containerID="c03bc7dc43b0e6b9f6cf0ec41c5c1dce60b8293bfddfe846a41d1f0cfc5926d6" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.111021 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.112325 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.115046 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.115285 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.150812 4792 scope.go:117] "RemoveContainer" containerID="214c25512ab085cac1c1d3b44799a8e7f9b21a549aa7b5e3bdc74e6e3342030a" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.153508 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.228799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.228882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.228979 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " 
pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.229059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.229077 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbmvd\" (UniqueName: \"kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.229246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.229309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.331803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.331879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.331949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.331986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.332044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.332102 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.332130 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbmvd\" (UniqueName: \"kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 
16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.332924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.333156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.338046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.339514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.340173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.343053 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.351709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbmvd\" (UniqueName: \"kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd\") pod \"ceilometer-0\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: W0318 16:00:29.419195 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46 WatchSource:0}: Error finding container 0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46: Status 404 returned error can't find the container with id 0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46 Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.419989 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.441546 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.817006 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-d42vd"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.818896 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.821324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.821616 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.830655 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d42vd"] Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.941101 4792 scope.go:117] "RemoveContainer" containerID="6dffd13a15c195a01c61ba4fddecd3d3d2c2f87ae6f586556c6a60b307ffc030" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.942148 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba4a9a0-1204-4a89-a584-f4f78990fd48" path="/var/lib/kubelet/pods/4ba4a9a0-1204-4a89-a584-f4f78990fd48/volumes" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.959659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.959766 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxmc\" (UniqueName: \"kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.959827 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:29 crc kubenswrapper[4792]: I0318 16:00:29.959896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.028868 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.031879 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.040763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.068431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.068542 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxmc\" (UniqueName: \"kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.068600 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.068655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.076851 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.110909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.111463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.119151 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc 
kubenswrapper[4792]: I0318 16:00:30.123659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxmc\" (UniqueName: \"kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc\") pod \"nova-cell0-cell-mapping-d42vd\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.139813 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.152726 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.163623 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.165691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.168122 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.173670 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.173724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk5rw\" (UniqueName: \"kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc 
kubenswrapper[4792]: I0318 16:00:30.173946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.174002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.181453 4792 generic.go:334] "Generic (PLEG): container finished" podID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerID="6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade" exitCode=0 Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.181596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerDied","Data":"6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade"} Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.181637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerStarted","Data":"0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46"} Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.280554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 
16:00:30.280655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.280754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.280843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2r9\" (UniqueName: \"kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.280988 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.281027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk5rw\" (UniqueName: \"kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.281211 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.282110 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.288349 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.300918 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.351070 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.390213 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.390568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2r9\" (UniqueName: \"kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9\") pod 
\"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.390873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.394998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.398288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk5rw\" (UniqueName: \"kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw\") pod \"nova-api-0\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.405257 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.502539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2r9\" (UniqueName: \"kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9\") pod \"nova-scheduler-0\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.659058 4792 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.661814 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.672463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.683285 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.706215 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.711691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.753297 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.755437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.764553 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.770103 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801541 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxq96\" (UniqueName: \"kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801690 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrlt\" (UniqueName: \"kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc 
kubenswrapper[4792]: I0318 16:00:30.801741 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.801792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.810125 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.814380 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.859281 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.904608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.904766 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.904852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drrlt\" (UniqueName: \"kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.905483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.905544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.905611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrrgz\" (UniqueName: \"kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0\") pod 
\"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.906998 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.907088 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxq96\" (UniqueName: \"kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.908484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.921732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc 
kubenswrapper[4792]: I0318 16:00:30.929462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.929649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.931596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.959915 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drrlt\" (UniqueName: \"kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt\") pod \"nova-metadata-0\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " pod="openstack/nova-metadata-0" Mar 18 16:00:30 crc kubenswrapper[4792]: I0318 16:00:30.969432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxq96\" (UniqueName: \"kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96\") pod \"nova-cell1-novncproxy-0\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrrgz\" (UniqueName: \"kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.010951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.013927 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.014195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.014797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.015283 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.020717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb\") pod 
\"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.056150 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrrgz\" (UniqueName: \"kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz\") pod \"dnsmasq-dns-568d7fd7cf-mpgvp\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.194294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.259313 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.273025 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.300290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerStarted","Data":"88761fb8a681082a84162a0659938070d4644ff1ba917fc786e7ba124951fd5d"} Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.663863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-d42vd"] Mar 18 16:00:31 crc kubenswrapper[4792]: I0318 16:00:31.909051 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:32 crc kubenswrapper[4792]: W0318 16:00:32.149007 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d0fb49e_571e_4c29_9e2f_6f9ec6c1978b.slice/crio-718115b6eacf5547ffd79e1b089f0d7ca59750b7f9e1577bd6145645a0345327 WatchSource:0}: 
Error finding container 718115b6eacf5547ffd79e1b089f0d7ca59750b7f9e1577bd6145645a0345327: Status 404 returned error can't find the container with id 718115b6eacf5547ffd79e1b089f0d7ca59750b7f9e1577bd6145645a0345327 Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.192790 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.368121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f","Type":"ContainerStarted","Data":"e4586617a7fe0c2c4fffbc16b8d29eefdd4da2432f7c40a9f5545e9bcdb4f531"} Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.373220 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerStarted","Data":"718115b6eacf5547ffd79e1b089f0d7ca59750b7f9e1577bd6145645a0345327"} Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.380287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d42vd" event={"ID":"5b243f42-c71b-4c56-817e-2345f5502ba6","Type":"ContainerStarted","Data":"6103b7766fae644e35d5c42a9d41bc28957de8c8454bfbc0b9271b8bf804bfa8"} Mar 18 16:00:32 crc kubenswrapper[4792]: W0318 16:00:32.414008 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice/crio-56397a19b13949d2593fef48abb21c20fe39bab38f82eee45f7dc7df53fdfcfb WatchSource:0}: Error finding container 56397a19b13949d2593fef48abb21c20fe39bab38f82eee45f7dc7df53fdfcfb: Status 404 returned error can't find the container with id 56397a19b13949d2593fef48abb21c20fe39bab38f82eee45f7dc7df53fdfcfb Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.435951 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:32 crc 
kubenswrapper[4792]: I0318 16:00:32.775618 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:00:32 crc kubenswrapper[4792]: I0318 16:00:32.847948 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:32 crc kubenswrapper[4792]: W0318 16:00:32.859882 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a69fcbc_7c8d_4c3b_8f0a_e3b3ab137943.slice/crio-7c4b699f739a8bfefdac82e1d80298763d63b5708ec63ca6a305d83ff1b840cb WatchSource:0}: Error finding container 7c4b699f739a8bfefdac82e1d80298763d63b5708ec63ca6a305d83ff1b840cb: Status 404 returned error can't find the container with id 7c4b699f739a8bfefdac82e1d80298763d63b5708ec63ca6a305d83ff1b840cb Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.475154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerStarted","Data":"7c4b699f739a8bfefdac82e1d80298763d63b5708ec63ca6a305d83ff1b840cb"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.491147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerStarted","Data":"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.503732 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d42vd" event={"ID":"5b243f42-c71b-4c56-817e-2345f5502ba6","Type":"ContainerStarted","Data":"cec2c139bce9cdf2b6818641e8e357f68496af8b0060764db9cd7888366647a4"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.535697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerStarted","Data":"70c7de2f4d8bec18968065322c1392bafe623435a04932f566d896d01b4ee47f"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.537802 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" event={"ID":"1372258e-0814-4692-9694-7dc28a519871","Type":"ContainerStarted","Data":"1a33052bfda33a8a9fb0cc9080c61991ac4d75feef290497a17f34e0950146c1"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.540382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8325ad2-da8e-4a5a-b759-57a468b289b7","Type":"ContainerStarted","Data":"56397a19b13949d2593fef48abb21c20fe39bab38f82eee45f7dc7df53fdfcfb"} Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.551543 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-d42vd" podStartSLOduration=4.551513199 podStartE2EDuration="4.551513199s" podCreationTimestamp="2026-03-18 16:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:33.53634242 +0000 UTC m=+1582.405671347" watchObservedRunningTime="2026-03-18 16:00:33.551513199 +0000 UTC m=+1582.420842136" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.773042 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qd24g"] Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.775391 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.780503 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.780762 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.789212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qd24g"] Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.879620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbftv\" (UniqueName: \"kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.879788 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.879853 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.879885 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.981768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.981853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.981888 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.982097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbftv\" (UniqueName: \"kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.991166 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:33 crc kubenswrapper[4792]: I0318 16:00:33.997877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.002800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbftv\" (UniqueName: \"kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.008466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts\") pod \"nova-cell1-conductor-db-sync-qd24g\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.145366 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.583716 4792 generic.go:334] "Generic (PLEG): container finished" podID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerID="06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa" exitCode=0 Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.585042 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerDied","Data":"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa"} Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.615650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerStarted","Data":"2bcf1551b2d8280271964aa81595c14b128ccdd501607219cec8ff0687ef672e"} Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.635772 4792 generic.go:334] "Generic (PLEG): container finished" podID="1372258e-0814-4692-9694-7dc28a519871" containerID="98ba62d9ae84759c61ea2884a57c6995a098d7af6049c24ba54da12f5d6e52ac" exitCode=0 Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.637581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" event={"ID":"1372258e-0814-4692-9694-7dc28a519871","Type":"ContainerDied","Data":"98ba62d9ae84759c61ea2884a57c6995a098d7af6049c24ba54da12f5d6e52ac"} Mar 18 16:00:34 crc kubenswrapper[4792]: I0318 16:00:34.862846 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qd24g"] Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.684607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" 
event={"ID":"1372258e-0814-4692-9694-7dc28a519871","Type":"ContainerStarted","Data":"d047c24c5ec62e5910239540f5637cac1516bdfb1ea1a76d7a82c98a343a06a3"} Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.685316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.713545 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qd24g" event={"ID":"03a94a71-c815-4ad4-851b-fd5139b6561b","Type":"ContainerStarted","Data":"2d24ea1bd8d3af1178b2b52d434a09543425388c352c0dfe2afa8d021176fefb"} Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.713607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qd24g" event={"ID":"03a94a71-c815-4ad4-851b-fd5139b6561b","Type":"ContainerStarted","Data":"4cfef55ed27ff74c386eefd67043a4f14c57a4c8c3aa79d7448dc60475a3f477"} Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.749214 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" podStartSLOduration=5.749195248 podStartE2EDuration="5.749195248s" podCreationTimestamp="2026-03-18 16:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:35.729640013 +0000 UTC m=+1584.598968960" watchObservedRunningTime="2026-03-18 16:00:35.749195248 +0000 UTC m=+1584.618524185" Mar 18 16:00:35 crc kubenswrapper[4792]: I0318 16:00:35.765223 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qd24g" podStartSLOduration=2.7652041819999997 podStartE2EDuration="2.765204182s" podCreationTimestamp="2026-03-18 16:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
16:00:35.746479724 +0000 UTC m=+1584.615808661" watchObservedRunningTime="2026-03-18 16:00:35.765204182 +0000 UTC m=+1584.634533119" Mar 18 16:00:36 crc kubenswrapper[4792]: I0318 16:00:36.124250 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:36 crc kubenswrapper[4792]: I0318 16:00:36.156136 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:00:36 crc kubenswrapper[4792]: I0318 16:00:36.744506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerStarted","Data":"a3896afa3594edc31162e98b4fa0a9edc6157d82b71145250f6bace9b6696131"} Mar 18 16:00:38 crc kubenswrapper[4792]: I0318 16:00:38.777979 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerStarted","Data":"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7"} Mar 18 16:00:38 crc kubenswrapper[4792]: I0318 16:00:38.831566 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:38 crc kubenswrapper[4792]: I0318 16:00:38.831782 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:00:39 crc kubenswrapper[4792]: I0318 16:00:39.977236 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-92bs2" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:00:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:00:39 crc kubenswrapper[4792]: > Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.821235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f","Type":"ContainerStarted","Data":"fb2e238d7c647ed90fc90d65663da0dea60fad6f405f604024803cbe0384a256"} Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.825615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerStarted","Data":"e78d99797d220dd1aadb329e24b727d62c250a95652771e1e1ce42674475c0a6"} Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.828699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerStarted","Data":"40d613cdf42ce7c11d1f7b6595725f0e8f341b7405d9e93aa25378235b59b0a6"} Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.832531 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerStarted","Data":"c5f75a1124305a171c689431ff37fa7aa71147d025af3cdd4d81a00f69dd9046"} Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.833659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.836376 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e8325ad2-da8e-4a5a-b759-57a468b289b7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7" gracePeriod=30 Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.836639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8325ad2-da8e-4a5a-b759-57a468b289b7","Type":"ContainerStarted","Data":"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7"} Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.857950 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.016230851 podStartE2EDuration="11.857927217s" podCreationTimestamp="2026-03-18 16:00:29 +0000 UTC" firstStartedPulling="2026-03-18 16:00:31.981156194 +0000 UTC m=+1580.850485131" lastFinishedPulling="2026-03-18 16:00:39.82285256 +0000 UTC m=+1588.692181497" observedRunningTime="2026-03-18 16:00:40.846142433 +0000 UTC m=+1589.715471380" watchObservedRunningTime="2026-03-18 16:00:40.857927217 +0000 UTC m=+1589.727256154" Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.863439 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-92bs2" podStartSLOduration=7.973712577 podStartE2EDuration="12.863421447s" podCreationTimestamp="2026-03-18 16:00:28 +0000 UTC" firstStartedPulling="2026-03-18 16:00:30.289273781 +0000 UTC m=+1579.158602728" lastFinishedPulling="2026-03-18 16:00:35.178982661 +0000 UTC m=+1584.048311598" observedRunningTime="2026-03-18 16:00:38.799533515 +0000 UTC m=+1587.668862472" watchObservedRunningTime="2026-03-18 16:00:40.863421447 +0000 UTC m=+1589.732750384" Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.897319 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.391593668 podStartE2EDuration="11.897298365s" podCreationTimestamp="2026-03-18 16:00:29 +0000 UTC" firstStartedPulling="2026-03-18 16:00:30.318219056 +0000 UTC m=+1579.187547993" lastFinishedPulling="2026-03-18 16:00:39.823923753 +0000 UTC m=+1588.693252690" observedRunningTime="2026-03-18 16:00:40.887499542 +0000 UTC m=+1589.756828479" watchObservedRunningTime="2026-03-18 16:00:40.897298365 +0000 UTC m=+1589.766627302" Mar 18 16:00:40 crc kubenswrapper[4792]: I0318 16:00:40.914281 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.517385434 
podStartE2EDuration="10.914262059s" podCreationTimestamp="2026-03-18 16:00:30 +0000 UTC" firstStartedPulling="2026-03-18 16:00:32.417918406 +0000 UTC m=+1581.287247343" lastFinishedPulling="2026-03-18 16:00:39.814795041 +0000 UTC m=+1588.684123968" observedRunningTime="2026-03-18 16:00:40.909390979 +0000 UTC m=+1589.778719916" watchObservedRunningTime="2026-03-18 16:00:40.914262059 +0000 UTC m=+1589.783590986" Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.260823 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.275169 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.377011 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.377254 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="dnsmasq-dns" containerID="cri-o://4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b" gracePeriod=10 Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.898294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerStarted","Data":"72be51b54b952a311cad0e5ffcd75a4039ec6ed2917a70af57cac7dc29845891"} Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.923582 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-log" containerID="cri-o://40d613cdf42ce7c11d1f7b6595725f0e8f341b7405d9e93aa25378235b59b0a6" gracePeriod=30 Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 
16:00:41.924126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerStarted","Data":"e6175dd0dfeab317de6f81e2930b0269d6141c5068afcc0eed0fcbcf348af0af"} Mar 18 16:00:41 crc kubenswrapper[4792]: I0318 16:00:41.926114 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-metadata" containerID="cri-o://e6175dd0dfeab317de6f81e2930b0269d6141c5068afcc0eed0fcbcf348af0af" gracePeriod=30 Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.039753 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.409050828 podStartE2EDuration="13.039731371s" podCreationTimestamp="2026-03-18 16:00:29 +0000 UTC" firstStartedPulling="2026-03-18 16:00:32.192177577 +0000 UTC m=+1581.061506514" lastFinishedPulling="2026-03-18 16:00:39.82285812 +0000 UTC m=+1588.692187057" observedRunningTime="2026-03-18 16:00:42.038702439 +0000 UTC m=+1590.908031376" watchObservedRunningTime="2026-03-18 16:00:42.039731371 +0000 UTC m=+1590.909060308" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.044634 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.11719937 podStartE2EDuration="12.044617902s" podCreationTimestamp="2026-03-18 16:00:30 +0000 UTC" firstStartedPulling="2026-03-18 16:00:32.895398836 +0000 UTC m=+1581.764727773" lastFinishedPulling="2026-03-18 16:00:39.822817368 +0000 UTC m=+1588.692146305" observedRunningTime="2026-03-18 16:00:42.011811048 +0000 UTC m=+1590.881139995" watchObservedRunningTime="2026-03-18 16:00:42.044617902 +0000 UTC m=+1590.913946839" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.749041 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.857884 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:00:42 crc kubenswrapper[4792]: E0318 16:00:42.858327 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906100 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflkg\" (UniqueName: \"kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906163 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906215 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.906404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0\") pod \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\" (UID: \"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f\") " Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.928578 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg" (OuterVolumeSpecName: "kube-api-access-fflkg") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "kube-api-access-fflkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.948733 4792 generic.go:334] "Generic (PLEG): container finished" podID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerID="4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b" exitCode=0 Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.948861 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.949318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" event={"ID":"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f","Type":"ContainerDied","Data":"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b"} Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.949482 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-bdzhh" event={"ID":"a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f","Type":"ContainerDied","Data":"7e5ac2061537a60c39384eb15e494d79c8b26def08deb4adf3162dd0d4283474"} Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.949647 4792 scope.go:117] "RemoveContainer" containerID="4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b" Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.953192 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerID="e6175dd0dfeab317de6f81e2930b0269d6141c5068afcc0eed0fcbcf348af0af" exitCode=0 Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.953230 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerID="40d613cdf42ce7c11d1f7b6595725f0e8f341b7405d9e93aa25378235b59b0a6" exitCode=143 Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.953830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerDied","Data":"e6175dd0dfeab317de6f81e2930b0269d6141c5068afcc0eed0fcbcf348af0af"} Mar 18 16:00:42 crc kubenswrapper[4792]: I0318 16:00:42.960591 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerDied","Data":"40d613cdf42ce7c11d1f7b6595725f0e8f341b7405d9e93aa25378235b59b0a6"} Mar 18 
16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.009330 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflkg\" (UniqueName: \"kubernetes.io/projected/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-kube-api-access-fflkg\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.074819 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.118378 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.135436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.178402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.195367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.202511 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config" (OuterVolumeSpecName: "config") pod "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" (UID: "a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.220524 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.220872 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.222100 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.222216 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.326003 
4792 scope.go:117] "RemoveContainer" containerID="2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.326719 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.355645 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.355845 4792 scope.go:117] "RemoveContainer" containerID="4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b" Mar 18 16:00:43 crc kubenswrapper[4792]: E0318 16:00:43.356796 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b\": container with ID starting with 4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b not found: ID does not exist" containerID="4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.356934 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b"} err="failed to get container status \"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b\": rpc error: code = NotFound desc = could not find container \"4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b\": container with ID starting with 4319d3fb1bbd9dc29f87de3e14ede4beaebc5e9106ba131dcd0d18fd0db2b01b not found: ID does not exist" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.357087 4792 scope.go:117] "RemoveContainer" containerID="2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1" Mar 18 16:00:43 crc kubenswrapper[4792]: E0318 16:00:43.357504 4792 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1\": container with ID starting with 2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1 not found: ID does not exist" containerID="2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.357610 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1"} err="failed to get container status \"2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1\": rpc error: code = NotFound desc = could not find container \"2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1\": container with ID starting with 2fde74247578042f1d8e3ea925e1a6e0b7ae83c78d621f7b6a03b345f68686a1 not found: ID does not exist" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.382193 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-bdzhh"] Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.528888 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs\") pod \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.528999 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data\") pod \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.529213 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drrlt\" (UniqueName: 
\"kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt\") pod \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.529335 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle\") pod \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\" (UID: \"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943\") " Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.531358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs" (OuterVolumeSpecName: "logs") pod "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" (UID: "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.545237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt" (OuterVolumeSpecName: "kube-api-access-drrlt") pod "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" (UID: "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943"). InnerVolumeSpecName "kube-api-access-drrlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.585336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data" (OuterVolumeSpecName: "config-data") pod "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" (UID: "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.587157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" (UID: "7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.591231 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.632605 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.633669 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.633779 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drrlt\" (UniqueName: \"kubernetes.io/projected/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-kube-api-access-drrlt\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.633896 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.872106 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" path="/var/lib/kubelet/pods/a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f/volumes" Mar 18 16:00:43 crc 
kubenswrapper[4792]: I0318 16:00:43.973052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943","Type":"ContainerDied","Data":"7c4b699f739a8bfefdac82e1d80298763d63b5708ec63ca6a305d83ff1b840cb"} Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.973162 4792 scope.go:117] "RemoveContainer" containerID="e6175dd0dfeab317de6f81e2930b0269d6141c5068afcc0eed0fcbcf348af0af" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.973252 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.975557 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="proxy-httpd" containerID="cri-o://c5f75a1124305a171c689431ff37fa7aa71147d025af3cdd4d81a00f69dd9046" gracePeriod=30 Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.975564 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-central-agent" containerID="cri-o://70c7de2f4d8bec18968065322c1392bafe623435a04932f566d896d01b4ee47f" gracePeriod=30 Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.975599 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="sg-core" containerID="cri-o://a3896afa3594edc31162e98b4fa0a9edc6157d82b71145250f6bace9b6696131" gracePeriod=30 Mar 18 16:00:43 crc kubenswrapper[4792]: I0318 16:00:43.975684 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-notification-agent" 
containerID="cri-o://2bcf1551b2d8280271964aa81595c14b128ccdd501607219cec8ff0687ef672e" gracePeriod=30 Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.062211 4792 scope.go:117] "RemoveContainer" containerID="40d613cdf42ce7c11d1f7b6595725f0e8f341b7405d9e93aa25378235b59b0a6" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.111959 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.139330 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.176296 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:44 crc kubenswrapper[4792]: E0318 16:00:44.176939 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="init" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.176960 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="init" Mar 18 16:00:44 crc kubenswrapper[4792]: E0318 16:00:44.177001 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-log" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177010 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-log" Mar 18 16:00:44 crc kubenswrapper[4792]: E0318 16:00:44.177040 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="dnsmasq-dns" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177050 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="dnsmasq-dns" Mar 18 16:00:44 crc kubenswrapper[4792]: E0318 16:00:44.177100 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-metadata" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177109 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-metadata" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177411 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d39a94-cc37-4e16-91ad-a3a2a3fdda9f" containerName="dnsmasq-dns" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177451 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-log" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.177486 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" containerName="nova-metadata-metadata" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.179216 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.184633 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.184889 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.198175 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.258081 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.258200 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.258268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz62s\" (UniqueName: \"kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.258371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.258452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.359778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.360116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.360275 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.360443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 
16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.360569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz62s\" (UniqueName: \"kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.361174 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.366597 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.366796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.379709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.380577 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz62s\" (UniqueName: 
\"kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s\") pod \"nova-metadata-0\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " pod="openstack/nova-metadata-0" Mar 18 16:00:44 crc kubenswrapper[4792]: I0318 16:00:44.619509 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035260 4792 generic.go:334] "Generic (PLEG): container finished" podID="c58cd8cc-5417-43bf-b198-b14699bef942" containerID="c5f75a1124305a171c689431ff37fa7aa71147d025af3cdd4d81a00f69dd9046" exitCode=0 Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035543 4792 generic.go:334] "Generic (PLEG): container finished" podID="c58cd8cc-5417-43bf-b198-b14699bef942" containerID="a3896afa3594edc31162e98b4fa0a9edc6157d82b71145250f6bace9b6696131" exitCode=2 Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035554 4792 generic.go:334] "Generic (PLEG): container finished" podID="c58cd8cc-5417-43bf-b198-b14699bef942" containerID="2bcf1551b2d8280271964aa81595c14b128ccdd501607219cec8ff0687ef672e" exitCode=0 Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerDied","Data":"c5f75a1124305a171c689431ff37fa7aa71147d025af3cdd4d81a00f69dd9046"} Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerDied","Data":"a3896afa3594edc31162e98b4fa0a9edc6157d82b71145250f6bace9b6696131"} Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.035634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerDied","Data":"2bcf1551b2d8280271964aa81595c14b128ccdd501607219cec8ff0687ef672e"} Mar 18 16:00:45 crc kubenswrapper[4792]: W0318 16:00:45.414097 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271c2892_b50c_400d_ab7e_6bc71ae05045.slice/crio-379ee30a543468d6af565a1f85714b9823a8845d6d9a5273b73b05f5742d1cb6 WatchSource:0}: Error finding container 379ee30a543468d6af565a1f85714b9823a8845d6d9a5273b73b05f5742d1cb6: Status 404 returned error can't find the container with id 379ee30a543468d6af565a1f85714b9823a8845d6d9a5273b73b05f5742d1cb6 Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.419106 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.709231 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:00:45 crc kubenswrapper[4792]: I0318 16:00:45.872867 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943" path="/var/lib/kubelet/pods/7a69fcbc-7c8d-4c3b-8f0a-e3b3ab137943/volumes" Mar 18 16:00:46 crc kubenswrapper[4792]: I0318 16:00:46.063314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerStarted","Data":"2ef1fcad6b84196353263a2e3b986f39094d2395a2a36d0fb8901c07382e43ba"} Mar 18 16:00:46 crc kubenswrapper[4792]: I0318 16:00:46.063359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerStarted","Data":"1d48f06f1f70bb85c05161e4448702231bc1c33a9b20680b845badaf2be708fc"} Mar 18 16:00:46 crc kubenswrapper[4792]: I0318 16:00:46.063371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerStarted","Data":"379ee30a543468d6af565a1f85714b9823a8845d6d9a5273b73b05f5742d1cb6"} Mar 18 16:00:46 crc kubenswrapper[4792]: I0318 16:00:46.115475 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.115452578 podStartE2EDuration="2.115452578s" podCreationTimestamp="2026-03-18 16:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:46.100060363 +0000 UTC m=+1594.969389290" watchObservedRunningTime="2026-03-18 16:00:46.115452578 +0000 UTC m=+1594.984781515" Mar 18 16:00:47 crc kubenswrapper[4792]: I0318 16:00:47.074493 4792 generic.go:334] "Generic (PLEG): container finished" podID="5b243f42-c71b-4c56-817e-2345f5502ba6" containerID="cec2c139bce9cdf2b6818641e8e357f68496af8b0060764db9cd7888366647a4" exitCode=0 Mar 18 16:00:47 crc kubenswrapper[4792]: I0318 16:00:47.074601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d42vd" event={"ID":"5b243f42-c71b-4c56-817e-2345f5502ba6","Type":"ContainerDied","Data":"cec2c139bce9cdf2b6818641e8e357f68496af8b0060764db9cd7888366647a4"} Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.673277 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.673795 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.712726 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.790830 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle\") pod \"5b243f42-c71b-4c56-817e-2345f5502ba6\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.791869 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbxmc\" (UniqueName: \"kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc\") pod \"5b243f42-c71b-4c56-817e-2345f5502ba6\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.792160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data\") pod \"5b243f42-c71b-4c56-817e-2345f5502ba6\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.792264 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts\") pod \"5b243f42-c71b-4c56-817e-2345f5502ba6\" (UID: \"5b243f42-c71b-4c56-817e-2345f5502ba6\") " Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.806680 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc" (OuterVolumeSpecName: "kube-api-access-hbxmc") pod "5b243f42-c71b-4c56-817e-2345f5502ba6" (UID: "5b243f42-c71b-4c56-817e-2345f5502ba6"). InnerVolumeSpecName "kube-api-access-hbxmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.814192 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts" (OuterVolumeSpecName: "scripts") pod "5b243f42-c71b-4c56-817e-2345f5502ba6" (UID: "5b243f42-c71b-4c56-817e-2345f5502ba6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.853499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data" (OuterVolumeSpecName: "config-data") pod "5b243f42-c71b-4c56-817e-2345f5502ba6" (UID: "5b243f42-c71b-4c56-817e-2345f5502ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.855568 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b243f42-c71b-4c56-817e-2345f5502ba6" (UID: "5b243f42-c71b-4c56-817e-2345f5502ba6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.898105 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.898139 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbxmc\" (UniqueName: \"kubernetes.io/projected/5b243f42-c71b-4c56-817e-2345f5502ba6-kube-api-access-hbxmc\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.898152 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:48 crc kubenswrapper[4792]: I0318 16:00:48.898160 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b243f42-c71b-4c56-817e-2345f5502ba6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.104399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-d42vd" event={"ID":"5b243f42-c71b-4c56-817e-2345f5502ba6","Type":"ContainerDied","Data":"6103b7766fae644e35d5c42a9d41bc28957de8c8454bfbc0b9271b8bf804bfa8"} Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.104444 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6103b7766fae644e35d5c42a9d41bc28957de8c8454bfbc0b9271b8bf804bfa8" Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.104512 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-d42vd" Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.282406 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.282677 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-log" containerID="cri-o://e78d99797d220dd1aadb329e24b727d62c250a95652771e1e1ce42674475c0a6" gracePeriod=30 Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.283274 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-api" containerID="cri-o://72be51b54b952a311cad0e5ffcd75a4039ec6ed2917a70af57cac7dc29845891" gracePeriod=30 Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.295581 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.295798 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" containerName="nova-scheduler-scheduler" containerID="cri-o://fb2e238d7c647ed90fc90d65663da0dea60fad6f405f604024803cbe0384a256" gracePeriod=30 Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.349677 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.350231 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-log" containerID="cri-o://1d48f06f1f70bb85c05161e4448702231bc1c33a9b20680b845badaf2be708fc" gracePeriod=30 Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.350316 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-metadata" containerID="cri-o://2ef1fcad6b84196353263a2e3b986f39094d2395a2a36d0fb8901c07382e43ba" gracePeriod=30 Mar 18 16:00:49 crc kubenswrapper[4792]: I0318 16:00:49.881223 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-92bs2" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:00:49 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:00:49 crc kubenswrapper[4792]: > Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.135912 4792 generic.go:334] "Generic (PLEG): container finished" podID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerID="2ef1fcad6b84196353263a2e3b986f39094d2395a2a36d0fb8901c07382e43ba" exitCode=0 Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.135953 4792 generic.go:334] "Generic (PLEG): container finished" podID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerID="1d48f06f1f70bb85c05161e4448702231bc1c33a9b20680b845badaf2be708fc" exitCode=143 Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.136024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerDied","Data":"2ef1fcad6b84196353263a2e3b986f39094d2395a2a36d0fb8901c07382e43ba"} Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.136058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerDied","Data":"1d48f06f1f70bb85c05161e4448702231bc1c33a9b20680b845badaf2be708fc"} Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.138575 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" 
containerID="72be51b54b952a311cad0e5ffcd75a4039ec6ed2917a70af57cac7dc29845891" exitCode=0 Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.138608 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerID="e78d99797d220dd1aadb329e24b727d62c250a95652771e1e1ce42674475c0a6" exitCode=143 Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.138627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerDied","Data":"72be51b54b952a311cad0e5ffcd75a4039ec6ed2917a70af57cac7dc29845891"} Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.138648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerDied","Data":"e78d99797d220dd1aadb329e24b727d62c250a95652771e1e1ce42674475c0a6"} Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.471155 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.480757 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle\") pod \"271c2892-b50c-400d-ab7e-6bc71ae05045\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk5rw\" (UniqueName: \"kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw\") pod \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564548 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz62s\" (UniqueName: \"kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s\") pod \"271c2892-b50c-400d-ab7e-6bc71ae05045\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle\") pod \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs\") pod \"271c2892-b50c-400d-ab7e-6bc71ae05045\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data\") pod \"271c2892-b50c-400d-ab7e-6bc71ae05045\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs\") pod \"271c2892-b50c-400d-ab7e-6bc71ae05045\" (UID: \"271c2892-b50c-400d-ab7e-6bc71ae05045\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data\") pod \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.564842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs\") pod \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\" (UID: \"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b\") " Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.566546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs" (OuterVolumeSpecName: "logs") pod "271c2892-b50c-400d-ab7e-6bc71ae05045" (UID: "271c2892-b50c-400d-ab7e-6bc71ae05045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.583218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs" (OuterVolumeSpecName: "logs") pod "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" (UID: "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.587165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw" (OuterVolumeSpecName: "kube-api-access-kk5rw") pod "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" (UID: "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b"). InnerVolumeSpecName "kube-api-access-kk5rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.600288 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s" (OuterVolumeSpecName: "kube-api-access-sz62s") pod "271c2892-b50c-400d-ab7e-6bc71ae05045" (UID: "271c2892-b50c-400d-ab7e-6bc71ae05045"). InnerVolumeSpecName "kube-api-access-sz62s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.624179 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" (UID: "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.653266 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "271c2892-b50c-400d-ab7e-6bc71ae05045" (UID: "271c2892-b50c-400d-ab7e-6bc71ae05045"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.661442 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data" (OuterVolumeSpecName: "config-data") pod "271c2892-b50c-400d-ab7e-6bc71ae05045" (UID: "271c2892-b50c-400d-ab7e-6bc71ae05045"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670519 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670567 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271c2892-b50c-400d-ab7e-6bc71ae05045-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670581 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670594 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670605 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670617 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk5rw\" (UniqueName: 
\"kubernetes.io/projected/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-kube-api-access-kk5rw\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.670631 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz62s\" (UniqueName: \"kubernetes.io/projected/271c2892-b50c-400d-ab7e-6bc71ae05045-kube-api-access-sz62s\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.697549 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "271c2892-b50c-400d-ab7e-6bc71ae05045" (UID: "271c2892-b50c-400d-ab7e-6bc71ae05045"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.701126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data" (OuterVolumeSpecName: "config-data") pod "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" (UID: "5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.772761 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/271c2892-b50c-400d-ab7e-6bc71ae05045-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:50 crc kubenswrapper[4792]: I0318 16:00:50.772793 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.169832 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" containerID="fb2e238d7c647ed90fc90d65663da0dea60fad6f405f604024803cbe0384a256" exitCode=0 Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.169937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f","Type":"ContainerDied","Data":"fb2e238d7c647ed90fc90d65663da0dea60fad6f405f604024803cbe0384a256"} Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.193561 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b","Type":"ContainerDied","Data":"718115b6eacf5547ffd79e1b089f0d7ca59750b7f9e1577bd6145645a0345327"} Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.193616 4792 scope.go:117] "RemoveContainer" containerID="72be51b54b952a311cad0e5ffcd75a4039ec6ed2917a70af57cac7dc29845891" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.193755 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.210227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"271c2892-b50c-400d-ab7e-6bc71ae05045","Type":"ContainerDied","Data":"379ee30a543468d6af565a1f85714b9823a8845d6d9a5273b73b05f5742d1cb6"} Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.210593 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.244325 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.253220 4792 scope.go:117] "RemoveContainer" containerID="e78d99797d220dd1aadb329e24b727d62c250a95652771e1e1ce42674475c0a6" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.287859 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.306314 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: E0318 16:00:51.307593 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-log" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.307624 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-log" Mar 18 16:00:51 crc kubenswrapper[4792]: E0318 16:00:51.307643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b243f42-c71b-4c56-817e-2345f5502ba6" containerName="nova-manage" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.307652 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b243f42-c71b-4c56-817e-2345f5502ba6" containerName="nova-manage" Mar 18 16:00:51 crc kubenswrapper[4792]: E0318 16:00:51.307698 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-log" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.307707 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-log" Mar 18 16:00:51 crc kubenswrapper[4792]: E0318 16:00:51.307744 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-api" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.307755 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-api" Mar 18 16:00:51 crc kubenswrapper[4792]: E0318 16:00:51.307777 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-metadata" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.307788 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-metadata" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.314723 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-log" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.314776 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" containerName="nova-api-api" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.314817 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-metadata" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.314834 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" containerName="nova-metadata-log" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 
16:00:51.314863 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b243f42-c71b-4c56-817e-2345f5502ba6" containerName="nova-manage" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.316618 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.322068 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.348111 4792 scope.go:117] "RemoveContainer" containerID="2ef1fcad6b84196353263a2e3b986f39094d2395a2a36d0fb8901c07382e43ba" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.367244 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.395811 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.397423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.397563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnzj\" (UniqueName: \"kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.397630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.397739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.399871 4792 scope.go:117] "RemoveContainer" containerID="1d48f06f1f70bb85c05161e4448702231bc1c33a9b20680b845badaf2be708fc" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.410131 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.423025 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.425762 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.433570 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.433747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.445053 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500438 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rgwb\" (UniqueName: \"kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500478 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data\") pod \"nova-metadata-0\" (UID: 
\"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phnzj\" (UniqueName: \"kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500573 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.500722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.501165 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.507618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.507769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.559914 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phnzj\" (UniqueName: \"kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj\") pod \"nova-api-0\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.606213 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.606310 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rgwb\" (UniqueName: \"kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " 
pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.606348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.606425 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.606529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.609582 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.627687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.628277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.631756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.665851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rgwb\" (UniqueName: \"kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb\") pod \"nova-metadata-0\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.668664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.765491 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.966517 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271c2892-b50c-400d-ab7e-6bc71ae05045" path="/var/lib/kubelet/pods/271c2892-b50c-400d-ab7e-6bc71ae05045/volumes" Mar 18 16:00:51 crc kubenswrapper[4792]: I0318 16:00:51.967687 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b" path="/var/lib/kubelet/pods/5d0fb49e-571e-4c29-9e2f-6f9ec6c1978b/volumes" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.014515 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-4wg26"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.033876 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.098037 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4wg26"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.108406 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.131653 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.132168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ftj\" (UniqueName: \"kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.144472 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-cfe5-account-create-update-6tpq5"] Mar 18 16:00:52 crc kubenswrapper[4792]: E0318 16:00:52.145418 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.145532 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.145959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" containerName="nova-scheduler-scheduler" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.147165 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.158260 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.214196 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-cfe5-account-create-update-6tpq5"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.233448 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2r9\" (UniqueName: \"kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9\") pod \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.233651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle\") pod \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.233739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data\") pod \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\" (UID: \"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f\") " Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.234038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts\") pod \"aodh-cfe5-account-create-update-6tpq5\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.234095 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxsnn\" (UniqueName: \"kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn\") pod \"aodh-cfe5-account-create-update-6tpq5\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.234150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.234327 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ftj\" (UniqueName: \"kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.245018 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.257278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ftj\" (UniqueName: \"kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj\") pod \"aodh-db-create-4wg26\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.259640 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9" (OuterVolumeSpecName: "kube-api-access-xz2r9") pod "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" (UID: "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f"). InnerVolumeSpecName "kube-api-access-xz2r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.270582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f","Type":"ContainerDied","Data":"e4586617a7fe0c2c4fffbc16b8d29eefdd4da2432f7c40a9f5545e9bcdb4f531"} Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.270687 4792 scope.go:117] "RemoveContainer" containerID="fb2e238d7c647ed90fc90d65663da0dea60fad6f405f604024803cbe0384a256" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.271140 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.303447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data" (OuterVolumeSpecName: "config-data") pod "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" (UID: "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.307191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" (UID: "e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.337340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts\") pod \"aodh-cfe5-account-create-update-6tpq5\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.337417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxsnn\" (UniqueName: \"kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn\") pod \"aodh-cfe5-account-create-update-6tpq5\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.337518 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2r9\" (UniqueName: \"kubernetes.io/projected/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-kube-api-access-xz2r9\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.337535 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.337548 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.345211 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts\") pod \"aodh-cfe5-account-create-update-6tpq5\" 
(UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.355285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxsnn\" (UniqueName: \"kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn\") pod \"aodh-cfe5-account-create-update-6tpq5\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.432880 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.499936 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.539836 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.741745 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.946482 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.978048 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.992701 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.994744 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:52 crc kubenswrapper[4792]: I0318 16:00:52.998884 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.013906 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:53 crc kubenswrapper[4792]: W0318 16:00:53.029360 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6d5ef0_0a5f_4a17_976d_f971ced7792e.slice/crio-669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb WatchSource:0}: Error finding container 669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb: Status 404 returned error can't find the container with id 669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.031500 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4wg26"] Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.064676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.064785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fs2\" (UniqueName: \"kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.064919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.167527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fs2\" (UniqueName: \"kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.167699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.167797 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.176943 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.184985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.193893 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fs2\" (UniqueName: \"kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2\") pod \"nova-scheduler-0\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") " pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.246244 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-cfe5-account-create-update-6tpq5"] Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.304779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerStarted","Data":"c8d5fef621915875bcb901a686dd28c6f785784c9c1ae374c31d9d9d7ea67426"} Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.304827 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerStarted","Data":"fbd7f975405458ad9997710b4785746a66b73afc0cb8605cf24ec84f4f1fdda4"} Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.308647 4792 generic.go:334] "Generic (PLEG): container finished" podID="c58cd8cc-5417-43bf-b198-b14699bef942" containerID="70c7de2f4d8bec18968065322c1392bafe623435a04932f566d896d01b4ee47f" exitCode=0 Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.308714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerDied","Data":"70c7de2f4d8bec18968065322c1392bafe623435a04932f566d896d01b4ee47f"} Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.310190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4wg26" 
event={"ID":"ff6d5ef0-0a5f-4a17-976d-f971ced7792e","Type":"ContainerStarted","Data":"669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb"} Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.314109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerStarted","Data":"a982166703556f9443e61f6bf32b7980f4c3b998cdba22829ff5ced6cdba4dd1"} Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.416765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:00:53 crc kubenswrapper[4792]: I0318 16:00:53.883518 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f" path="/var/lib/kubelet/pods/e6b643c0-7645-4c26-9d14-bd9ff4aa7f7f/volumes" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.001294 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.097736 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.097823 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.097940 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.098253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.098341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbmvd\" (UniqueName: \"kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.098370 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.098509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data\") pod \"c58cd8cc-5417-43bf-b198-b14699bef942\" (UID: \"c58cd8cc-5417-43bf-b198-b14699bef942\") " Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.118430 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.121079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.125401 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.147774 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd" (OuterVolumeSpecName: "kube-api-access-nbmvd") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "kube-api-access-nbmvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.152258 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts" (OuterVolumeSpecName: "scripts") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.190363 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.213231 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.213263 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbmvd\" (UniqueName: \"kubernetes.io/projected/c58cd8cc-5417-43bf-b198-b14699bef942-kube-api-access-nbmvd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.213274 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.213285 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c58cd8cc-5417-43bf-b198-b14699bef942-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.213294 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.267673 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.316316 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.332113 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data" (OuterVolumeSpecName: "config-data") pod "c58cd8cc-5417-43bf-b198-b14699bef942" (UID: "c58cd8cc-5417-43bf-b198-b14699bef942"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.339018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c58cd8cc-5417-43bf-b198-b14699bef942","Type":"ContainerDied","Data":"88761fb8a681082a84162a0659938070d4644ff1ba917fc786e7ba124951fd5d"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.339156 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.339236 4792 scope.go:117] "RemoveContainer" containerID="c5f75a1124305a171c689431ff37fa7aa71147d025af3cdd4d81a00f69dd9046" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.341922 4792 generic.go:334] "Generic (PLEG): container finished" podID="21421779-aad0-4455-bd2f-90c3344e6fe6" containerID="d261b7dd451d531e087d82f9347526c698d26ede95d18889e101ab7c17f6cd6c" exitCode=0 Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.342103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cfe5-account-create-update-6tpq5" event={"ID":"21421779-aad0-4455-bd2f-90c3344e6fe6","Type":"ContainerDied","Data":"d261b7dd451d531e087d82f9347526c698d26ede95d18889e101ab7c17f6cd6c"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.342175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cfe5-account-create-update-6tpq5" event={"ID":"21421779-aad0-4455-bd2f-90c3344e6fe6","Type":"ContainerStarted","Data":"23be2245a029df5bf43b5911027e7dd0fa04ad69d1ef76aefd17ec574bc3f8af"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.346538 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff6d5ef0-0a5f-4a17-976d-f971ced7792e" containerID="f3ad500ff66c09c5b3a9e0e708685b7002f322d759f4c0a0a99c29e4142d13a0" exitCode=0 Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.346611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4wg26" event={"ID":"ff6d5ef0-0a5f-4a17-976d-f971ced7792e","Type":"ContainerDied","Data":"f3ad500ff66c09c5b3a9e0e708685b7002f322d759f4c0a0a99c29e4142d13a0"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.349916 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerStarted","Data":"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155"} Mar 
18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.353643 4792 generic.go:334] "Generic (PLEG): container finished" podID="03a94a71-c815-4ad4-851b-fd5139b6561b" containerID="2d24ea1bd8d3af1178b2b52d434a09543425388c352c0dfe2afa8d021176fefb" exitCode=0 Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.353740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qd24g" event={"ID":"03a94a71-c815-4ad4-851b-fd5139b6561b","Type":"ContainerDied","Data":"2d24ea1bd8d3af1178b2b52d434a09543425388c352c0dfe2afa8d021176fefb"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.355339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a22d88-19a8-4567-ba96-b3d2de7d9553","Type":"ContainerStarted","Data":"7d9987e8130a1cdab6f0dacbe40a949eceeb87cd8cd10200d67fa6a0d74b48be"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.363101 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerStarted","Data":"8b85b235d026a4985ba81fd2b92992c067602c64c6c9d7c619f4fc2a2b61af19"} Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.384912 4792 scope.go:117] "RemoveContainer" containerID="a3896afa3594edc31162e98b4fa0a9edc6157d82b71145250f6bace9b6696131" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.418934 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58cd8cc-5417-43bf-b198-b14699bef942-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.426898 4792 scope.go:117] "RemoveContainer" containerID="2bcf1551b2d8280271964aa81595c14b128ccdd501607219cec8ff0687ef672e" Mar 18 16:00:54 crc kubenswrapper[4792]: E0318 16:00:54.431630 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21421779_aad0_4455_bd2f_90c3344e6fe6.slice/crio-conmon-d261b7dd451d531e087d82f9347526c698d26ede95d18889e101ab7c17f6cd6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6d5ef0_0a5f_4a17_976d_f971ced7792e.slice/crio-conmon-f3ad500ff66c09c5b3a9e0e708685b7002f322d759f4c0a0a99c29e4142d13a0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21421779_aad0_4455_bd2f_90c3344e6fe6.slice/crio-d261b7dd451d531e087d82f9347526c698d26ede95d18889e101ab7c17f6cd6c.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.459234 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.459192135 podStartE2EDuration="3.459192135s" podCreationTimestamp="2026-03-18 16:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:54.432184989 +0000 UTC m=+1603.301513926" watchObservedRunningTime="2026-03-18 16:00:54.459192135 +0000 UTC m=+1603.328521072" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.488030 4792 scope.go:117] "RemoveContainer" containerID="70c7de2f4d8bec18968065322c1392bafe623435a04932f566d896d01b4ee47f" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.552543 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.567254 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.579940 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:54 crc kubenswrapper[4792]: E0318 16:00:54.580939 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="sg-core" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.580986 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="sg-core" Mar 18 16:00:54 crc kubenswrapper[4792]: E0318 16:00:54.581027 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-notification-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581039 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-notification-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: E0318 16:00:54.581054 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="proxy-httpd" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581064 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="proxy-httpd" Mar 18 16:00:54 crc kubenswrapper[4792]: E0318 16:00:54.581074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-central-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581082 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-central-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581441 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-notification-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581478 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="ceilometer-central-agent" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 
16:00:54.581511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="proxy-httpd" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.581527 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" containerName="sg-core" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.585258 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.588367 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.588514 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.599456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.627302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.627423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn44p\" (UniqueName: \"kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.627651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.627865 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.628065 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.628343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.628460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " 
pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730393 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn44p\" (UniqueName: \"kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730507 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.730695 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.731097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.731313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.734604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.734916 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.735377 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.736165 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.752267 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn44p\" (UniqueName: \"kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p\") pod \"ceilometer-0\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " pod="openstack/ceilometer-0" Mar 18 16:00:54 crc kubenswrapper[4792]: I0318 16:00:54.993686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 16:00:55.384700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a22d88-19a8-4567-ba96-b3d2de7d9553","Type":"ContainerStarted","Data":"a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f"} Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 16:00:55.395176 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerStarted","Data":"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89"} Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 16:00:55.447989 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.447950331 podStartE2EDuration="3.447950331s" podCreationTimestamp="2026-03-18 16:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:55.415145246 +0000 UTC m=+1604.284474543" watchObservedRunningTime="2026-03-18 16:00:55.447950331 +0000 UTC m=+1604.317279268" Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 
16:00:55.452702 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.452690228 podStartE2EDuration="4.452690228s" podCreationTimestamp="2026-03-18 16:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:55.434833475 +0000 UTC m=+1604.304162432" watchObservedRunningTime="2026-03-18 16:00:55.452690228 +0000 UTC m=+1604.322019165" Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 16:00:55.665016 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:00:55 crc kubenswrapper[4792]: I0318 16:00:55.878381 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58cd8cc-5417-43bf-b198-b14699bef942" path="/var/lib/kubelet/pods/c58cd8cc-5417-43bf-b198-b14699bef942/volumes" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.086677 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.184451 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxsnn\" (UniqueName: \"kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn\") pod \"21421779-aad0-4455-bd2f-90c3344e6fe6\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.184700 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts\") pod \"21421779-aad0-4455-bd2f-90c3344e6fe6\" (UID: \"21421779-aad0-4455-bd2f-90c3344e6fe6\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.186800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21421779-aad0-4455-bd2f-90c3344e6fe6" (UID: "21421779-aad0-4455-bd2f-90c3344e6fe6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.191039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn" (OuterVolumeSpecName: "kube-api-access-kxsnn") pod "21421779-aad0-4455-bd2f-90c3344e6fe6" (UID: "21421779-aad0-4455-bd2f-90c3344e6fe6"). InnerVolumeSpecName "kube-api-access-kxsnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.288212 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxsnn\" (UniqueName: \"kubernetes.io/projected/21421779-aad0-4455-bd2f-90c3344e6fe6-kube-api-access-kxsnn\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.288564 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21421779-aad0-4455-bd2f-90c3344e6fe6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.294769 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.308619 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle\") pod \"03a94a71-c815-4ad4-851b-fd5139b6561b\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data\") pod \"03a94a71-c815-4ad4-851b-fd5139b6561b\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390548 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts\") pod \"03a94a71-c815-4ad4-851b-fd5139b6561b\" (UID: 
\"03a94a71-c815-4ad4-851b-fd5139b6561b\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ftj\" (UniqueName: \"kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj\") pod \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts\") pod \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\" (UID: \"ff6d5ef0-0a5f-4a17-976d-f971ced7792e\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.390734 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbftv\" (UniqueName: \"kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv\") pod \"03a94a71-c815-4ad4-851b-fd5139b6561b\" (UID: \"03a94a71-c815-4ad4-851b-fd5139b6561b\") " Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.393589 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff6d5ef0-0a5f-4a17-976d-f971ced7792e" (UID: "ff6d5ef0-0a5f-4a17-976d-f971ced7792e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.396470 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts" (OuterVolumeSpecName: "scripts") pod "03a94a71-c815-4ad4-851b-fd5139b6561b" (UID: "03a94a71-c815-4ad4-851b-fd5139b6561b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.396595 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj" (OuterVolumeSpecName: "kube-api-access-f8ftj") pod "ff6d5ef0-0a5f-4a17-976d-f971ced7792e" (UID: "ff6d5ef0-0a5f-4a17-976d-f971ced7792e"). InnerVolumeSpecName "kube-api-access-f8ftj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.397305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv" (OuterVolumeSpecName: "kube-api-access-dbftv") pod "03a94a71-c815-4ad4-851b-fd5139b6561b" (UID: "03a94a71-c815-4ad4-851b-fd5139b6561b"). InnerVolumeSpecName "kube-api-access-dbftv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.431078 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03a94a71-c815-4ad4-851b-fd5139b6561b" (UID: "03a94a71-c815-4ad4-851b-fd5139b6561b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.438346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerStarted","Data":"22e679d883c9b481d310020eacf07b3e4ad5b3f56181dfe75941977a58de2a5c"} Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.440669 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qd24g" event={"ID":"03a94a71-c815-4ad4-851b-fd5139b6561b","Type":"ContainerDied","Data":"4cfef55ed27ff74c386eefd67043a4f14c57a4c8c3aa79d7448dc60475a3f477"} Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.440700 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cfef55ed27ff74c386eefd67043a4f14c57a4c8c3aa79d7448dc60475a3f477" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.440751 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qd24g" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.442937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-cfe5-account-create-update-6tpq5" event={"ID":"21421779-aad0-4455-bd2f-90c3344e6fe6","Type":"ContainerDied","Data":"23be2245a029df5bf43b5911027e7dd0fa04ad69d1ef76aefd17ec574bc3f8af"} Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.442980 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23be2245a029df5bf43b5911027e7dd0fa04ad69d1ef76aefd17ec574bc3f8af" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.443029 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-cfe5-account-create-update-6tpq5" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.455531 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-4wg26" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.457554 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data" (OuterVolumeSpecName: "config-data") pod "03a94a71-c815-4ad4-851b-fd5139b6561b" (UID: "03a94a71-c815-4ad4-851b-fd5139b6561b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.455617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4wg26" event={"ID":"ff6d5ef0-0a5f-4a17-976d-f971ced7792e","Type":"ContainerDied","Data":"669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb"} Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.461485 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669d2755988d522b74319bd618ee9bfe9955c88e3169c8ea2e64f3fa9da39afb" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497822 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497856 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497870 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8ftj\" (UniqueName: \"kubernetes.io/projected/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-kube-api-access-f8ftj\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497883 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff6d5ef0-0a5f-4a17-976d-f971ced7792e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497897 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbftv\" (UniqueName: \"kubernetes.io/projected/03a94a71-c815-4ad4-851b-fd5139b6561b-kube-api-access-dbftv\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.497909 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a94a71-c815-4ad4-851b-fd5139b6561b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.533095 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:56 crc kubenswrapper[4792]: E0318 16:00:56.533782 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a94a71-c815-4ad4-851b-fd5139b6561b" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.533838 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a94a71-c815-4ad4-851b-fd5139b6561b" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:56 crc kubenswrapper[4792]: E0318 16:00:56.533863 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6d5ef0-0a5f-4a17-976d-f971ced7792e" containerName="mariadb-database-create" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.533873 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6d5ef0-0a5f-4a17-976d-f971ced7792e" containerName="mariadb-database-create" Mar 18 16:00:56 crc kubenswrapper[4792]: E0318 16:00:56.533904 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21421779-aad0-4455-bd2f-90c3344e6fe6" containerName="mariadb-account-create-update" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.533913 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21421779-aad0-4455-bd2f-90c3344e6fe6" containerName="mariadb-account-create-update" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.534212 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21421779-aad0-4455-bd2f-90c3344e6fe6" containerName="mariadb-account-create-update" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.534235 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a94a71-c815-4ad4-851b-fd5139b6561b" containerName="nova-cell1-conductor-db-sync" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.534269 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6d5ef0-0a5f-4a17-976d-f971ced7792e" containerName="mariadb-database-create" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.535351 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.548662 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.600008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.600663 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.600825 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45zw8\" (UniqueName: \"kubernetes.io/projected/b391e203-fca6-4bcb-870c-d04691525743-kube-api-access-45zw8\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.703755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.703888 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45zw8\" (UniqueName: \"kubernetes.io/projected/b391e203-fca6-4bcb-870c-d04691525743-kube-api-access-45zw8\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.704098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.709400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.709463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b391e203-fca6-4bcb-870c-d04691525743-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.726558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45zw8\" (UniqueName: \"kubernetes.io/projected/b391e203-fca6-4bcb-870c-d04691525743-kube-api-access-45zw8\") pod \"nova-cell1-conductor-0\" (UID: \"b391e203-fca6-4bcb-870c-d04691525743\") " pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.854103 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:00:56 crc kubenswrapper[4792]: E0318 16:00:56.854421 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:00:56 crc kubenswrapper[4792]: I0318 16:00:56.865656 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:57 crc kubenswrapper[4792]: I0318 16:00:57.441714 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 16:00:57 crc kubenswrapper[4792]: W0318 16:00:57.446556 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb391e203_fca6_4bcb_870c_d04691525743.slice/crio-de6f43b91b06e379fd2c3c9969e0104546b7389afd41814dbc6cc810a780c5d9 WatchSource:0}: Error finding container de6f43b91b06e379fd2c3c9969e0104546b7389afd41814dbc6cc810a780c5d9: Status 404 returned error can't find the container with id de6f43b91b06e379fd2c3c9969e0104546b7389afd41814dbc6cc810a780c5d9 Mar 18 16:00:57 crc kubenswrapper[4792]: I0318 16:00:57.477623 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerStarted","Data":"85650a87eb009642b4b2638b8d3467e735b84e5c12ce1f87b5f87581ef221783"} Mar 18 16:00:57 crc kubenswrapper[4792]: I0318 16:00:57.477670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerStarted","Data":"f14a91904135041b3a7524f59eec817cf238343acd52de8dd0c02254dd7094d3"} Mar 18 16:00:57 crc kubenswrapper[4792]: I0318 16:00:57.480018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b391e203-fca6-4bcb-870c-d04691525743","Type":"ContainerStarted","Data":"de6f43b91b06e379fd2c3c9969e0104546b7389afd41814dbc6cc810a780c5d9"} Mar 18 16:00:58 crc kubenswrapper[4792]: I0318 16:00:58.417940 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:00:58 crc kubenswrapper[4792]: I0318 16:00:58.511599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"b391e203-fca6-4bcb-870c-d04691525743","Type":"ContainerStarted","Data":"7e4a41d29f9ec34448e79f784c6d6ad074dc2a1d77db60a7345b55c85bb13dde"} Mar 18 16:00:58 crc kubenswrapper[4792]: I0318 16:00:58.513427 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 16:00:58 crc kubenswrapper[4792]: I0318 16:00:58.532244 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerStarted","Data":"d2d1a4b7bec6c8ad3e45574c16ca5141bad8ed286fc1695beaee1312afed7b15"} Mar 18 16:00:58 crc kubenswrapper[4792]: I0318 16:00:58.541180 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.5411544839999998 podStartE2EDuration="2.541154484s" podCreationTimestamp="2026-03-18 16:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:00:58.527355057 +0000 UTC m=+1607.396683994" watchObservedRunningTime="2026-03-18 16:00:58.541154484 +0000 UTC m=+1607.410483421" Mar 18 16:00:59 crc kubenswrapper[4792]: I0318 16:00:59.943471 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-92bs2" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:00:59 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:00:59 crc kubenswrapper[4792]: > Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.146015 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564161-5whqc"] Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.148380 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.166444 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-5whqc"] Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.199620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.199700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.199872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6krfw\" (UniqueName: \"kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.199938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.301985 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6krfw\" (UniqueName: \"kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.302092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.302145 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.302199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.310121 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.310138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.311039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.330311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6krfw\" (UniqueName: \"kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw\") pod \"keystone-cron-29564161-5whqc\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.537212 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.556295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerStarted","Data":"11f4ca4543fc0d41a6cd5a12c402108f1762272f3fa78999c5e20fb2bc826dd5"} Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.557232 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:01:00 crc kubenswrapper[4792]: I0318 16:01:00.597166 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.283758798 podStartE2EDuration="6.597143291s" podCreationTimestamp="2026-03-18 16:00:54 +0000 UTC" firstStartedPulling="2026-03-18 16:00:55.671021947 +0000 UTC m=+1604.540350894" lastFinishedPulling="2026-03-18 16:00:59.98440645 +0000 UTC m=+1608.853735387" observedRunningTime="2026-03-18 16:01:00.581369004 +0000 UTC m=+1609.450697941" watchObservedRunningTime="2026-03-18 16:01:00.597143291 +0000 UTC m=+1609.466472248" Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.129472 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564161-5whqc"] Mar 18 16:01:01 crc kubenswrapper[4792]: W0318 16:01:01.159285 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f22059_bc37_4a08_911c_f38b0b38b322.slice/crio-db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09 WatchSource:0}: Error finding container db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09: Status 404 returned error can't find the container with id db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09 Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.567757 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29564161-5whqc" event={"ID":"b8f22059-bc37-4a08-911c-f38b0b38b322","Type":"ContainerStarted","Data":"3ac351e60c86b8e76762ec6e04a0569fb1503268ea64a879906c5b83f238207d"} Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.568197 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-5whqc" event={"ID":"b8f22059-bc37-4a08-911c-f38b0b38b322","Type":"ContainerStarted","Data":"db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09"} Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.602608 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564161-5whqc" podStartSLOduration=1.602579883 podStartE2EDuration="1.602579883s" podCreationTimestamp="2026-03-18 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:01.588262801 +0000 UTC m=+1610.457591738" watchObservedRunningTime="2026-03-18 16:01:01.602579883 +0000 UTC m=+1610.471908830" Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.670611 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.670666 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.766486 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:01 crc kubenswrapper[4792]: I0318 16:01:01.766544 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.622639 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-s667r"] Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.624566 4792 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.627318 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5q5xg" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.627390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.627567 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.627844 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.634713 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-s667r"] Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.674319 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6sqm\" (UniqueName: \"kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.674382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.674466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts\") pod \"aodh-db-sync-s667r\" (UID: 
\"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.674773 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.755171 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.255:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.755456 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.255:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.779851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.780008 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6sqm\" (UniqueName: \"kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.780029 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.780071 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.781215 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.781548 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.787269 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.789094 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts\") pod 
\"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.802232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6sqm\" (UniqueName: \"kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.807677 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle\") pod \"aodh-db-sync-s667r\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:02 crc kubenswrapper[4792]: I0318 16:01:02.970689 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:03 crc kubenswrapper[4792]: I0318 16:01:03.431844 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 16:01:03 crc kubenswrapper[4792]: I0318 16:01:03.516100 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:01:03 crc kubenswrapper[4792]: I0318 16:01:03.664051 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:01:03 crc kubenswrapper[4792]: I0318 16:01:03.685841 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-s667r"] Mar 18 16:01:03 crc kubenswrapper[4792]: W0318 16:01:03.732924 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18757a2_4c26_41be_9348_ea8624af2527.slice/crio-f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f 
WatchSource:0}: Error finding container f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f: Status 404 returned error can't find the container with id f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f Mar 18 16:01:04 crc kubenswrapper[4792]: I0318 16:01:04.742554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-s667r" event={"ID":"b18757a2-4c26-41be-9348-ea8624af2527","Type":"ContainerStarted","Data":"f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f"} Mar 18 16:01:06 crc kubenswrapper[4792]: I0318 16:01:06.912502 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 16:01:07 crc kubenswrapper[4792]: I0318 16:01:07.798270 4792 generic.go:334] "Generic (PLEG): container finished" podID="b8f22059-bc37-4a08-911c-f38b0b38b322" containerID="3ac351e60c86b8e76762ec6e04a0569fb1503268ea64a879906c5b83f238207d" exitCode=0 Mar 18 16:01:07 crc kubenswrapper[4792]: I0318 16:01:07.798347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-5whqc" event={"ID":"b8f22059-bc37-4a08-911c-f38b0b38b322","Type":"ContainerDied","Data":"3ac351e60c86b8e76762ec6e04a0569fb1503268ea64a879906c5b83f238207d"} Mar 18 16:01:08 crc kubenswrapper[4792]: I0318 16:01:08.854443 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:01:08 crc kubenswrapper[4792]: E0318 16:01:08.855314 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:01:08 crc kubenswrapper[4792]: I0318 
16:01:08.892075 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:01:08 crc kubenswrapper[4792]: I0318 16:01:08.964812 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:01:09 crc kubenswrapper[4792]: I0318 16:01:09.148946 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:01:09 crc kubenswrapper[4792]: I0318 16:01:09.670074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:09 crc kubenswrapper[4792]: I0318 16:01:09.670444 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:09 crc kubenswrapper[4792]: I0318 16:01:09.766954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:09 crc kubenswrapper[4792]: I0318 16:01:09.767102 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.770850 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.822562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle\") pod \"b8f22059-bc37-4a08-911c-f38b0b38b322\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.822903 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys\") pod \"b8f22059-bc37-4a08-911c-f38b0b38b322\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.823145 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data\") pod \"b8f22059-bc37-4a08-911c-f38b0b38b322\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.823196 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6krfw\" (UniqueName: \"kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw\") pod \"b8f22059-bc37-4a08-911c-f38b0b38b322\" (UID: \"b8f22059-bc37-4a08-911c-f38b0b38b322\") " Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.836088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw" (OuterVolumeSpecName: "kube-api-access-6krfw") pod "b8f22059-bc37-4a08-911c-f38b0b38b322" (UID: "b8f22059-bc37-4a08-911c-f38b0b38b322"). InnerVolumeSpecName "kube-api-access-6krfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.838297 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b8f22059-bc37-4a08-911c-f38b0b38b322" (UID: "b8f22059-bc37-4a08-911c-f38b0b38b322"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.845612 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-92bs2" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" containerID="cri-o://dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7" gracePeriod=2 Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.846123 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564161-5whqc" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.846127 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564161-5whqc" event={"ID":"b8f22059-bc37-4a08-911c-f38b0b38b322","Type":"ContainerDied","Data":"db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09"} Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.846513 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db0acef61fd450f6db3bbc762ed03a327b35dc9d926676fdc8e0b35a9f9f8a09" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.899621 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f22059-bc37-4a08-911c-f38b0b38b322" (UID: "b8f22059-bc37-4a08-911c-f38b0b38b322"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.926060 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data" (OuterVolumeSpecName: "config-data") pod "b8f22059-bc37-4a08-911c-f38b0b38b322" (UID: "b8f22059-bc37-4a08-911c-f38b0b38b322"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.926191 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6krfw\" (UniqueName: \"kubernetes.io/projected/b8f22059-bc37-4a08-911c-f38b0b38b322-kube-api-access-6krfw\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.926456 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:10 crc kubenswrapper[4792]: I0318 16:01:10.926467 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.029356 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f22059-bc37-4a08-911c-f38b0b38b322-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.431617 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.448564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxq96\" (UniqueName: \"kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96\") pod \"e8325ad2-da8e-4a5a-b759-57a468b289b7\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.449058 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data\") pod \"e8325ad2-da8e-4a5a-b759-57a468b289b7\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.449177 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle\") pod \"e8325ad2-da8e-4a5a-b759-57a468b289b7\" (UID: \"e8325ad2-da8e-4a5a-b759-57a468b289b7\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.455257 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96" (OuterVolumeSpecName: "kube-api-access-vxq96") pod "e8325ad2-da8e-4a5a-b759-57a468b289b7" (UID: "e8325ad2-da8e-4a5a-b759-57a468b289b7"). InnerVolumeSpecName "kube-api-access-vxq96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.511080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8325ad2-da8e-4a5a-b759-57a468b289b7" (UID: "e8325ad2-da8e-4a5a-b759-57a468b289b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.515759 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data" (OuterVolumeSpecName: "config-data") pod "e8325ad2-da8e-4a5a-b759-57a468b289b7" (UID: "e8325ad2-da8e-4a5a-b759-57a468b289b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.553069 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.553120 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8325ad2-da8e-4a5a-b759-57a468b289b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.553141 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxq96\" (UniqueName: \"kubernetes.io/projected/e8325ad2-da8e-4a5a-b759-57a468b289b7-kube-api-access-vxq96\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.641012 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.688182 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.691302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.700043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.766237 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2llk\" (UniqueName: \"kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk\") pod \"17418767-af15-46e0-b37e-0c1d8102a2e6\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.766344 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content\") pod \"17418767-af15-46e0-b37e-0c1d8102a2e6\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.766616 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities\") pod \"17418767-af15-46e0-b37e-0c1d8102a2e6\" (UID: \"17418767-af15-46e0-b37e-0c1d8102a2e6\") " Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.768911 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities" (OuterVolumeSpecName: "utilities") pod "17418767-af15-46e0-b37e-0c1d8102a2e6" (UID: "17418767-af15-46e0-b37e-0c1d8102a2e6"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.785465 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.785592 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.796235 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.804250 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk" (OuterVolumeSpecName: "kube-api-access-p2llk") pod "17418767-af15-46e0-b37e-0c1d8102a2e6" (UID: "17418767-af15-46e0-b37e-0c1d8102a2e6"). InnerVolumeSpecName "kube-api-access-p2llk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.805847 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.830302 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17418767-af15-46e0-b37e-0c1d8102a2e6" (UID: "17418767-af15-46e0-b37e-0c1d8102a2e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.876604 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.876656 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2llk\" (UniqueName: \"kubernetes.io/projected/17418767-af15-46e0-b37e-0c1d8102a2e6-kube-api-access-p2llk\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.876671 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17418767-af15-46e0-b37e-0c1d8102a2e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.887130 4792 generic.go:334] "Generic (PLEG): container finished" podID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerID="dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7" exitCode=0 Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.889220 4792 generic.go:334] "Generic (PLEG): container finished" podID="e8325ad2-da8e-4a5a-b759-57a468b289b7" containerID="9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7" exitCode=137 Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.909936 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.910606 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92bs2" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.914868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-s667r" event={"ID":"b18757a2-4c26-41be-9348-ea8624af2527","Type":"ContainerStarted","Data":"0ab2646bb2f66bfaa423606ab3cb52b9cf7ad5033855396c58f9ecbd885b3a62"} Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.915203 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerDied","Data":"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7"} Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.915305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92bs2" event={"ID":"17418767-af15-46e0-b37e-0c1d8102a2e6","Type":"ContainerDied","Data":"0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46"} Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.915442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8325ad2-da8e-4a5a-b759-57a468b289b7","Type":"ContainerDied","Data":"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7"} Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.915523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e8325ad2-da8e-4a5a-b759-57a468b289b7","Type":"ContainerDied","Data":"56397a19b13949d2593fef48abb21c20fe39bab38f82eee45f7dc7df53fdfcfb"} Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.915349 4792 scope.go:117] "RemoveContainer" containerID="dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7" Mar 18 16:01:11 crc kubenswrapper[4792]: I0318 16:01:11.932546 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:01:11 crc 
kubenswrapper[4792]: I0318 16:01:11.975684 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-s667r" podStartSLOduration=3.141734792 podStartE2EDuration="9.975635653s" podCreationTimestamp="2026-03-18 16:01:02 +0000 UTC" firstStartedPulling="2026-03-18 16:01:03.738514373 +0000 UTC m=+1612.607843310" lastFinishedPulling="2026-03-18 16:01:10.572415234 +0000 UTC m=+1619.441744171" observedRunningTime="2026-03-18 16:01:11.969909896 +0000 UTC m=+1620.839238843" watchObservedRunningTime="2026-03-18 16:01:11.975635653 +0000 UTC m=+1620.844964590" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.013471 4792 scope.go:117] "RemoveContainer" containerID="06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.096794 4792 scope.go:117] "RemoveContainer" containerID="6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.133860 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.216057 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.272902 4792 scope.go:117] "RemoveContainer" containerID="dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.274397 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7\": container with ID starting with dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7 not found: ID does not exist" containerID="dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.274444 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7"} err="failed to get container status \"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7\": rpc error: code = NotFound desc = could not find container \"dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7\": container with ID starting with dde50375690542b122fcc75a7df6e0b24e5ab20a715a30481777083ac0f23bb7 not found: ID does not exist" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.274477 4792 scope.go:117] "RemoveContainer" containerID="06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.275605 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa\": container with ID starting with 06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa not found: ID does not exist" containerID="06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.275628 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa"} err="failed to get container status \"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa\": rpc error: code = NotFound desc = could not find container \"06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa\": container with ID starting with 06c8a59191cb0d6fcbf64cf0203612470715912f35853525359f9acf5059f1fa not found: ID does not exist" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.275647 4792 scope.go:117] "RemoveContainer" containerID="6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 
16:01:12.287596 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade\": container with ID starting with 6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade not found: ID does not exist" containerID="6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.287640 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade"} err="failed to get container status \"6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade\": rpc error: code = NotFound desc = could not find container \"6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade\": container with ID starting with 6ee3d5ef7357b056fe700da4923fccd1571f93bab83cad30f90705b00aea3ade not found: ID does not exist" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.287669 4792 scope.go:117] "RemoveContainer" containerID="9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.329682 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372038 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.372715 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="extract-content" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372741 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="extract-content" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.372763 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372772 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.372812 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f22059-bc37-4a08-911c-f38b0b38b322" containerName="keystone-cron" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372823 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f22059-bc37-4a08-911c-f38b0b38b322" containerName="keystone-cron" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.372836 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8325ad2-da8e-4a5a-b759-57a468b289b7" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372844 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8325ad2-da8e-4a5a-b759-57a468b289b7" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.372858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="extract-utilities" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.372867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="extract-utilities" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.373242 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8325ad2-da8e-4a5a-b759-57a468b289b7" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.373294 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" containerName="registry-server" Mar 18 16:01:12 crc 
kubenswrapper[4792]: I0318 16:01:12.373304 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f22059-bc37-4a08-911c-f38b0b38b322" containerName="keystone-cron" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.374434 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.378526 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.378768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.378945 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.408042 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-92bs2"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.416612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.421480 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.421710 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.421909 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.422056 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sf8z\" (UniqueName: \"kubernetes.io/projected/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-kube-api-access-9sf8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.442789 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.462152 4792 scope.go:117] "RemoveContainer" containerID="9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7" Mar 18 16:01:12 crc kubenswrapper[4792]: E0318 16:01:12.472170 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7\": container with ID starting with 9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7 not found: ID does not exist" containerID="9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.472419 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7"} err="failed to get container status \"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7\": rpc error: code = NotFound desc = could not find container \"9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7\": container with ID starting with 9581c4216bdc241ea461f2bd79cc9b9881d0d1b5ee3c187fd15c3c2b5f10bba7 not found: ID does not exist" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.477044 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.479126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.503019 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524242 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524472 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsptf\" (UniqueName: 
\"kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.524687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sf8z\" (UniqueName: \"kubernetes.io/projected/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-kube-api-access-9sf8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.541111 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.541577 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.556648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.557432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.577671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sf8z\" (UniqueName: \"kubernetes.io/projected/4411b06a-98e3-4eb2-bfa9-cf954a003e3a-kube-api-access-9sf8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"4411b06a-98e3-4eb2-bfa9-cf954a003e3a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.626549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.626827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0\") pod 
\"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.626879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.626928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.627032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsptf\" (UniqueName: \"kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.627151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.628023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " 
pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.628764 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.629569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.629831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.630538 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.654659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsptf\" (UniqueName: \"kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf\") pod \"dnsmasq-dns-f84f9ccf-vs6l6\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.743998 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:12 crc kubenswrapper[4792]: I0318 16:01:12.819884 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:13 crc kubenswrapper[4792]: E0318 16:01:13.235248 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.472070 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 16:01:13 crc kubenswrapper[4792]: W0318 16:01:13.484196 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4411b06a_98e3_4eb2_bfa9_cf954a003e3a.slice/crio-796f959ddd368136cb96a1ec58489d9737026fe83b4094b91c8c684b6d06024b WatchSource:0}: Error finding container 796f959ddd368136cb96a1ec58489d9737026fe83b4094b91c8c684b6d06024b: Status 404 returned error can't find the container with id 796f959ddd368136cb96a1ec58489d9737026fe83b4094b91c8c684b6d06024b Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.752388 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.870563 4792 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="17418767-af15-46e0-b37e-0c1d8102a2e6" path="/var/lib/kubelet/pods/17418767-af15-46e0-b37e-0c1d8102a2e6/volumes" Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.871917 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8325ad2-da8e-4a5a-b759-57a468b289b7" path="/var/lib/kubelet/pods/e8325ad2-da8e-4a5a-b759-57a468b289b7/volumes" Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.937139 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4411b06a-98e3-4eb2-bfa9-cf954a003e3a","Type":"ContainerStarted","Data":"796f959ddd368136cb96a1ec58489d9737026fe83b4094b91c8c684b6d06024b"} Mar 18 16:01:13 crc kubenswrapper[4792]: I0318 16:01:13.941196 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" event={"ID":"40fc9569-2619-4401-905f-7f39df040ecb","Type":"ContainerStarted","Data":"17bc40311c9c163fac4f46df28616bd35ef49185d75b4df718a892850a559727"} Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.029248 4792 generic.go:334] "Generic (PLEG): container finished" podID="40fc9569-2619-4401-905f-7f39df040ecb" containerID="8ca544437284785d8d1298a4681da252c571b07a9d2044d3b570597a54ca847e" exitCode=0 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.033684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" event={"ID":"40fc9569-2619-4401-905f-7f39df040ecb","Type":"ContainerDied","Data":"8ca544437284785d8d1298a4681da252c571b07a9d2044d3b570597a54ca847e"} Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.045560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4411b06a-98e3-4eb2-bfa9-cf954a003e3a","Type":"ContainerStarted","Data":"3d2f19f7824c16306ef5b204416af265941f4085999569ec6b55e015721c1e62"} Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.102112 4792 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.102089123 podStartE2EDuration="3.102089123s" podCreationTimestamp="2026-03-18 16:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:15.080334051 +0000 UTC m=+1623.949663008" watchObservedRunningTime="2026-03-18 16:01:15.102089123 +0000 UTC m=+1623.971418060" Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.559784 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.560361 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-central-agent" containerID="cri-o://85650a87eb009642b4b2638b8d3467e735b84e5c12ce1f87b5f87581ef221783" gracePeriod=30 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.560501 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-notification-agent" containerID="cri-o://f14a91904135041b3a7524f59eec817cf238343acd52de8dd0c02254dd7094d3" gracePeriod=30 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.560512 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="proxy-httpd" containerID="cri-o://11f4ca4543fc0d41a6cd5a12c402108f1762272f3fa78999c5e20fb2bc826dd5" gracePeriod=30 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.561065 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="sg-core" containerID="cri-o://d2d1a4b7bec6c8ad3e45574c16ca5141bad8ed286fc1695beaee1312afed7b15" 
gracePeriod=30 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.567475 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.4:3000/\": EOF" Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.691029 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.691407 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-log" containerID="cri-o://c8d5fef621915875bcb901a686dd28c6f785784c9c1ae374c31d9d9d7ea67426" gracePeriod=30 Mar 18 16:01:15 crc kubenswrapper[4792]: I0318 16:01:15.691734 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-api" containerID="cri-o://8b85b235d026a4985ba81fd2b92992c067602c64c6c9d7c619f4fc2a2b61af19" gracePeriod=30 Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.061589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" event={"ID":"40fc9569-2619-4401-905f-7f39df040ecb","Type":"ContainerStarted","Data":"89c262d9a910bfee5d80883e1dba821067ea03428179bed79bb67704babc339a"} Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.062017 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.068545 4792 generic.go:334] "Generic (PLEG): container finished" podID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerID="11f4ca4543fc0d41a6cd5a12c402108f1762272f3fa78999c5e20fb2bc826dd5" exitCode=0 Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.068588 4792 generic.go:334] "Generic (PLEG): container 
finished" podID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerID="d2d1a4b7bec6c8ad3e45574c16ca5141bad8ed286fc1695beaee1312afed7b15" exitCode=2 Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.068637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerDied","Data":"11f4ca4543fc0d41a6cd5a12c402108f1762272f3fa78999c5e20fb2bc826dd5"} Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.068668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerDied","Data":"d2d1a4b7bec6c8ad3e45574c16ca5141bad8ed286fc1695beaee1312afed7b15"} Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.071256 4792 generic.go:334] "Generic (PLEG): container finished" podID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerID="c8d5fef621915875bcb901a686dd28c6f785784c9c1ae374c31d9d9d7ea67426" exitCode=143 Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.071303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerDied","Data":"c8d5fef621915875bcb901a686dd28c6f785784c9c1ae374c31d9d9d7ea67426"} Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.073024 4792 generic.go:334] "Generic (PLEG): container finished" podID="b18757a2-4c26-41be-9348-ea8624af2527" containerID="0ab2646bb2f66bfaa423606ab3cb52b9cf7ad5033855396c58f9ecbd885b3a62" exitCode=0 Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.073095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-s667r" event={"ID":"b18757a2-4c26-41be-9348-ea8624af2527","Type":"ContainerDied","Data":"0ab2646bb2f66bfaa423606ab3cb52b9cf7ad5033855396c58f9ecbd885b3a62"} Mar 18 16:01:16 crc kubenswrapper[4792]: I0318 16:01:16.102508 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" podStartSLOduration=4.10248278 podStartE2EDuration="4.10248278s" podCreationTimestamp="2026-03-18 16:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:16.100755557 +0000 UTC m=+1624.970084504" watchObservedRunningTime="2026-03-18 16:01:16.10248278 +0000 UTC m=+1624.971811717" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.090651 4792 generic.go:334] "Generic (PLEG): container finished" podID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerID="f14a91904135041b3a7524f59eec817cf238343acd52de8dd0c02254dd7094d3" exitCode=0 Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.091112 4792 generic.go:334] "Generic (PLEG): container finished" podID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerID="85650a87eb009642b4b2638b8d3467e735b84e5c12ce1f87b5f87581ef221783" exitCode=0 Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.090878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerDied","Data":"f14a91904135041b3a7524f59eec817cf238343acd52de8dd0c02254dd7094d3"} Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.091356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerDied","Data":"85650a87eb009642b4b2638b8d3467e735b84e5c12ce1f87b5f87581ef221783"} Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.652766 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.700639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701397 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701439 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701480 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn44p\" (UniqueName: 
\"kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.701727 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd\") pod \"1bb99d4b-c646-4518-af8d-d7994f61c47c\" (UID: \"1bb99d4b-c646-4518-af8d-d7994f61c47c\") " Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.703011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.703394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.736494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts" (OuterVolumeSpecName: "scripts") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.744443 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.745929 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p" (OuterVolumeSpecName: "kube-api-access-xn44p") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "kube-api-access-xn44p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.757514 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.815815 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.815855 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.815885 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn44p\" (UniqueName: \"kubernetes.io/projected/1bb99d4b-c646-4518-af8d-d7994f61c47c-kube-api-access-xn44p\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.815901 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.815913 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bb99d4b-c646-4518-af8d-d7994f61c47c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.853936 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.882382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data" (OuterVolumeSpecName: "config-data") pod "1bb99d4b-c646-4518-af8d-d7994f61c47c" (UID: "1bb99d4b-c646-4518-af8d-d7994f61c47c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.919701 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:17 crc kubenswrapper[4792]: I0318 16:01:17.919733 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb99d4b-c646-4518-af8d-d7994f61c47c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.003476 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.021891 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts\") pod \"b18757a2-4c26-41be-9348-ea8624af2527\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.022190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6sqm\" (UniqueName: \"kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm\") pod \"b18757a2-4c26-41be-9348-ea8624af2527\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.022228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data\") pod \"b18757a2-4c26-41be-9348-ea8624af2527\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.022672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle\") pod \"b18757a2-4c26-41be-9348-ea8624af2527\" (UID: \"b18757a2-4c26-41be-9348-ea8624af2527\") " Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.029099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts" (OuterVolumeSpecName: "scripts") pod "b18757a2-4c26-41be-9348-ea8624af2527" (UID: "b18757a2-4c26-41be-9348-ea8624af2527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.029447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm" (OuterVolumeSpecName: "kube-api-access-w6sqm") pod "b18757a2-4c26-41be-9348-ea8624af2527" (UID: "b18757a2-4c26-41be-9348-ea8624af2527"). InnerVolumeSpecName "kube-api-access-w6sqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.060878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data" (OuterVolumeSpecName: "config-data") pod "b18757a2-4c26-41be-9348-ea8624af2527" (UID: "b18757a2-4c26-41be-9348-ea8624af2527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.106963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b18757a2-4c26-41be-9348-ea8624af2527" (UID: "b18757a2-4c26-41be-9348-ea8624af2527"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.107611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-s667r" event={"ID":"b18757a2-4c26-41be-9348-ea8624af2527","Type":"ContainerDied","Data":"f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f"} Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.107824 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7980c0191f4ed35343841ab1b0e4ff1ae50fc304457c7b30493e659168af80f" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.107694 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-s667r" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.118275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bb99d4b-c646-4518-af8d-d7994f61c47c","Type":"ContainerDied","Data":"22e679d883c9b481d310020eacf07b3e4ad5b3f56181dfe75941977a58de2a5c"} Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.118337 4792 scope.go:117] "RemoveContainer" containerID="11f4ca4543fc0d41a6cd5a12c402108f1762272f3fa78999c5e20fb2bc826dd5" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.118344 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.129373 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6sqm\" (UniqueName: \"kubernetes.io/projected/b18757a2-4c26-41be-9348-ea8624af2527-kube-api-access-w6sqm\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.129655 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.129756 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.129833 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b18757a2-4c26-41be-9348-ea8624af2527-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.242067 4792 scope.go:117] "RemoveContainer" containerID="d2d1a4b7bec6c8ad3e45574c16ca5141bad8ed286fc1695beaee1312afed7b15" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.255873 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.270489 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.291144 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.293861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-central-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 
16:01:18.293899 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-central-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.293929 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="proxy-httpd" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.293942 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="proxy-httpd" Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.294034 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-notification-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294046 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-notification-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.294082 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="sg-core" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294091 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="sg-core" Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.294104 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18757a2-4c26-41be-9348-ea8624af2527" containerName="aodh-db-sync" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294111 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18757a2-4c26-41be-9348-ea8624af2527" containerName="aodh-db-sync" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294410 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-central-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 
16:01:18.294435 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="sg-core" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294451 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="proxy-httpd" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294464 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" containerName="ceilometer-notification-agent" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.294480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18757a2-4c26-41be-9348-ea8624af2527" containerName="aodh-db-sync" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.296186 4792 scope.go:117] "RemoveContainer" containerID="f14a91904135041b3a7524f59eec817cf238343acd52de8dd0c02254dd7094d3" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.297425 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.304388 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.329900 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.337836 4792 scope.go:117] "RemoveContainer" containerID="85650a87eb009642b4b2638b8d3467e735b84e5c12ce1f87b5f87581ef221783" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.361334 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.441436 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.441608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.442054 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.442402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.442783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.443144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4x7\" (UniqueName: \"kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.443323 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.545657 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.545779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.545862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.545936 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4x7\" (UniqueName: \"kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.546010 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.546156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.546187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.546457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.547097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.554459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.554687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.554730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.554823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.562732 4792 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:18 crc kubenswrapper[4792]: E0318 16:01:18.565501 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nt4x7], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="f784cab2-894a-47c1-859e-306511a63186" Mar 18 16:01:18 crc kubenswrapper[4792]: I0318 16:01:18.569541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4x7\" (UniqueName: \"kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7\") pod \"ceilometer-0\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " pod="openstack/ceilometer-0" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.141397 4792 generic.go:334] "Generic (PLEG): container finished" podID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerID="8b85b235d026a4985ba81fd2b92992c067602c64c6c9d7c619f4fc2a2b61af19" exitCode=0 Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.141648 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.141495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerDied","Data":"8b85b235d026a4985ba81fd2b92992c067602c64c6c9d7c619f4fc2a2b61af19"} Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.519901 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.672916 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673076 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673134 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673163 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4x7\" (UniqueName: \"kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673247 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673305 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml\") pod \"f784cab2-894a-47c1-859e-306511a63186\" (UID: \"f784cab2-894a-47c1-859e-306511a63186\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.673871 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.674396 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.679738 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.679843 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7" (OuterVolumeSpecName: "kube-api-access-nt4x7") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "kube-api-access-nt4x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.680000 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.681563 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts" (OuterVolumeSpecName: "scripts") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.682335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data" (OuterVolumeSpecName: "config-data") pod "f784cab2-894a-47c1-859e-306511a63186" (UID: "f784cab2-894a-47c1-859e-306511a63186"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775642 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775679 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775688 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f784cab2-894a-47c1-859e-306511a63186-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775698 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775709 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4x7\" (UniqueName: \"kubernetes.io/projected/f784cab2-894a-47c1-859e-306511a63186-kube-api-access-nt4x7\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775718 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.775727 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f784cab2-894a-47c1-859e-306511a63186-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.811178 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.877016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle\") pod \"47725242-ec4c-452b-8503-92cb2c5a65f6\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.877613 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs\") pod \"47725242-ec4c-452b-8503-92cb2c5a65f6\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.877747 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phnzj\" (UniqueName: \"kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj\") pod \"47725242-ec4c-452b-8503-92cb2c5a65f6\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.877779 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data\") pod \"47725242-ec4c-452b-8503-92cb2c5a65f6\" (UID: \"47725242-ec4c-452b-8503-92cb2c5a65f6\") " Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.878536 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs" (OuterVolumeSpecName: "logs") pod "47725242-ec4c-452b-8503-92cb2c5a65f6" (UID: "47725242-ec4c-452b-8503-92cb2c5a65f6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.878916 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47725242-ec4c-452b-8503-92cb2c5a65f6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.883211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj" (OuterVolumeSpecName: "kube-api-access-phnzj") pod "47725242-ec4c-452b-8503-92cb2c5a65f6" (UID: "47725242-ec4c-452b-8503-92cb2c5a65f6"). InnerVolumeSpecName "kube-api-access-phnzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.886685 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb99d4b-c646-4518-af8d-d7994f61c47c" path="/var/lib/kubelet/pods/1bb99d4b-c646-4518-af8d-d7994f61c47c/volumes" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.942603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data" (OuterVolumeSpecName: "config-data") pod "47725242-ec4c-452b-8503-92cb2c5a65f6" (UID: "47725242-ec4c-452b-8503-92cb2c5a65f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.944210 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47725242-ec4c-452b-8503-92cb2c5a65f6" (UID: "47725242-ec4c-452b-8503-92cb2c5a65f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.984693 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.984737 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phnzj\" (UniqueName: \"kubernetes.io/projected/47725242-ec4c-452b-8503-92cb2c5a65f6-kube-api-access-phnzj\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:19 crc kubenswrapper[4792]: I0318 16:01:19.984747 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47725242-ec4c-452b-8503-92cb2c5a65f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.156431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.156439 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.157054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47725242-ec4c-452b-8503-92cb2c5a65f6","Type":"ContainerDied","Data":"fbd7f975405458ad9997710b4785746a66b73afc0cb8605cf24ec84f4f1fdda4"} Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.157156 4792 scope.go:117] "RemoveContainer" containerID="8b85b235d026a4985ba81fd2b92992c067602c64c6c9d7c619f4fc2a2b61af19" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.179550 4792 scope.go:117] "RemoveContainer" containerID="c8d5fef621915875bcb901a686dd28c6f785784c9c1ae374c31d9d9d7ea67426" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.233048 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.252603 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.269892 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: E0318 16:01:20.270454 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-api" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.270480 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-api" Mar 18 16:01:20 crc kubenswrapper[4792]: E0318 16:01:20.270525 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-log" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.270535 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-log" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.270746 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-log" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.270781 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" containerName="nova-api-api" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.272904 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.276365 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.276636 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.290476 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.306226 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.319096 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.321721 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.327267 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.327504 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.327652 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.329892 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.344404 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 
16:01:20.398494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6rb\" (UniqueName: \"kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d6wr\" (UniqueName: \"kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398781 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.398987 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.399010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.399039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.399104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.399409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.399459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.501947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6rb\" (UniqueName: \"kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d6wr\" (UniqueName: \"kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502224 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502305 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502564 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.502742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc 
kubenswrapper[4792]: I0318 16:01:20.502823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.503293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.503346 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.503767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.512096 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.512340 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " 
pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.513759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.514416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.515992 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.517621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.520473 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.533566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.543926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6rb\" (UniqueName: \"kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb\") pod \"ceilometer-0\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.555646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d6wr\" (UniqueName: \"kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr\") pod \"nova-api-0\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") " pod="openstack/nova-api-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.591308 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:20 crc kubenswrapper[4792]: I0318 16:01:20.672754 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:01:21 crc kubenswrapper[4792]: I0318 16:01:21.195149 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:21 crc kubenswrapper[4792]: I0318 16:01:21.319859 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:21 crc kubenswrapper[4792]: W0318 16:01:21.322209 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde841d5a_8f14_4578_b8d7_e132eebadc2c.slice/crio-a383e0634620955f0596876f4f7bf78276f21db4bf3c578072f149d6e9a2b6ad WatchSource:0}: Error finding container a383e0634620955f0596876f4f7bf78276f21db4bf3c578072f149d6e9a2b6ad: Status 404 returned error can't find the container with id a383e0634620955f0596876f4f7bf78276f21db4bf3c578072f149d6e9a2b6ad Mar 18 16:01:21 crc kubenswrapper[4792]: I0318 16:01:21.870749 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47725242-ec4c-452b-8503-92cb2c5a65f6" path="/var/lib/kubelet/pods/47725242-ec4c-452b-8503-92cb2c5a65f6/volumes" Mar 18 16:01:21 crc kubenswrapper[4792]: I0318 16:01:21.878900 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f784cab2-894a-47c1-859e-306511a63186" path="/var/lib/kubelet/pods/f784cab2-894a-47c1-859e-306511a63186/volumes" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.155427 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.158931 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.162353 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5q5xg" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.162675 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.182161 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.203435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerStarted","Data":"794e7cb78223401e500978468d4c8684ae038b683f3d0d00f44838f6eef4e214"} Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.203484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerStarted","Data":"d65c940eb809ba3c56bd4052caae015e727cc134cb3502a23bcd3f2fc02eb84a"} Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.203496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerStarted","Data":"a383e0634620955f0596876f4f7bf78276f21db4bf3c578072f149d6e9a2b6ad"} Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.207258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerStarted","Data":"655b2a3a9cdcc64ea17a85de2ecab35ce520652eb2f54bb37619fa4cf8248a7c"} Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.207287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerStarted","Data":"03a98b9886ccbbf6211695b68e49e61efef71d149f82d10213868775aff64e54"} Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.211174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.252326 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.252309985 podStartE2EDuration="2.252309985s" podCreationTimestamp="2026-03-18 16:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:22.226362152 +0000 UTC m=+1631.095691089" watchObservedRunningTime="2026-03-18 16:01:22.252309985 +0000 UTC m=+1631.121638922" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.293771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.293827 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.294208 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.294267 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8t7v\" (UniqueName: \"kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.396671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.397021 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8t7v\" (UniqueName: \"kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.397096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.397155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.402459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") 
" pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.403359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.403781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.420949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8t7v\" (UniqueName: \"kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v\") pod \"aodh-0\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.480615 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.744480 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.787130 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.822280 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:01:22 crc kubenswrapper[4792]: I0318 16:01:22.857552 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:01:22 crc kubenswrapper[4792]: E0318 16:01:22.858263 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.069329 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.070291 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="dnsmasq-dns" containerID="cri-o://d047c24c5ec62e5910239540f5637cac1516bdfb1ea1a76d7a82c98a343a06a3" gracePeriod=10 Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.099122 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.248188 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerStarted","Data":"be67e25b35f59f321961c7cfeb36022b94d547d8ef4869f1f1fec9d55cdac091"} Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.288166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerStarted","Data":"e1d2bb81f32e35db05f03ab8f39a604a75f805c4bbf4826487468e3c015e6ec0"} Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.344206 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.692200 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dlvbw"] Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.694921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.705769 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.705915 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.773074 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlvbw"] Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.798017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 
16:01:23.798116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbgbh\" (UniqueName: \"kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.798181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.798374 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: E0318 16:01:23.826784 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.902097 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.902434 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbgbh\" (UniqueName: \"kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.902483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.902652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.908566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.912562 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.913268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: I0318 16:01:23.925673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbgbh\" (UniqueName: \"kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh\") pod \"nova-cell1-cell-mapping-dlvbw\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:23 crc kubenswrapper[4792]: E0318 16:01:23.964335 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.056164 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.335334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerStarted","Data":"3044f1f48cb6f4fb2e217d92c04b84fdef0037fbff43be713e563b3579bd311e"} Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.340291 4792 generic.go:334] "Generic (PLEG): container finished" podID="1372258e-0814-4692-9694-7dc28a519871" containerID="d047c24c5ec62e5910239540f5637cac1516bdfb1ea1a76d7a82c98a343a06a3" exitCode=0 Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.340367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" event={"ID":"1372258e-0814-4692-9694-7dc28a519871","Type":"ContainerDied","Data":"d047c24c5ec62e5910239540f5637cac1516bdfb1ea1a76d7a82c98a343a06a3"} Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.340426 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" event={"ID":"1372258e-0814-4692-9694-7dc28a519871","Type":"ContainerDied","Data":"1a33052bfda33a8a9fb0cc9080c61991ac4d75feef290497a17f34e0950146c1"} Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.340444 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a33052bfda33a8a9fb0cc9080c61991ac4d75feef290497a17f34e0950146c1" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.340394 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.426992 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.427240 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.427336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrrgz\" (UniqueName: \"kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.427372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.427398 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.427433 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb\") pod \"1372258e-0814-4692-9694-7dc28a519871\" (UID: \"1372258e-0814-4692-9694-7dc28a519871\") " Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.444040 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz" (OuterVolumeSpecName: "kube-api-access-jrrgz") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). InnerVolumeSpecName "kube-api-access-jrrgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.530599 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrrgz\" (UniqueName: \"kubernetes.io/projected/1372258e-0814-4692-9694-7dc28a519871-kube-api-access-jrrgz\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.638028 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.645673 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.677768 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.690919 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.693876 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config" (OuterVolumeSpecName: "config") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.717539 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlvbw"] Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.740018 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1372258e-0814-4692-9694-7dc28a519871" (UID: "1372258e-0814-4692-9694-7dc28a519871"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.748633 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.748677 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.748730 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:24 crc kubenswrapper[4792]: I0318 16:01:24.748741 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1372258e-0814-4692-9694-7dc28a519871-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.409505 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerStarted","Data":"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc"} Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.416328 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-mpgvp" Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.417147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlvbw" event={"ID":"6a77d6c0-2e1b-4b20-a323-8991335efabc","Type":"ContainerStarted","Data":"1fb4e977184085feb898aaa04e01d3b12d3b2c0f491e933a04720786f6ea2b7b"} Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.417212 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlvbw" event={"ID":"6a77d6c0-2e1b-4b20-a323-8991335efabc","Type":"ContainerStarted","Data":"56c938f5daf1b9a0d7ef482fb94c2faacaa93f112fd5536eb081685f1dbcca74"} Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.447451 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dlvbw" podStartSLOduration=2.447430298 podStartE2EDuration="2.447430298s" podCreationTimestamp="2026-03-18 16:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:25.438947496 +0000 UTC m=+1634.308276433" watchObservedRunningTime="2026-03-18 16:01:25.447430298 +0000 UTC m=+1634.316759235" Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.475758 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.503704 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-mpgvp"] Mar 18 16:01:25 crc kubenswrapper[4792]: I0318 16:01:25.879100 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1372258e-0814-4692-9694-7dc28a519871" path="/var/lib/kubelet/pods/1372258e-0814-4692-9694-7dc28a519871/volumes" Mar 18 16:01:26 crc kubenswrapper[4792]: I0318 16:01:26.210350 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 
16:01:27 crc kubenswrapper[4792]: I0318 16:01:27.443473 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerStarted","Data":"998cd0cf05d1ada1d78a255981ee6963d0a94c693d5da2662cf667c22d1a87d6"} Mar 18 16:01:27 crc kubenswrapper[4792]: I0318 16:01:27.443995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:01:27 crc kubenswrapper[4792]: I0318 16:01:27.450130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerStarted","Data":"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59"} Mar 18 16:01:27 crc kubenswrapper[4792]: I0318 16:01:27.469024 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.389915169 podStartE2EDuration="7.469000762s" podCreationTimestamp="2026-03-18 16:01:20 +0000 UTC" firstStartedPulling="2026-03-18 16:01:21.199958273 +0000 UTC m=+1630.069287210" lastFinishedPulling="2026-03-18 16:01:26.279043866 +0000 UTC m=+1635.148372803" observedRunningTime="2026-03-18 16:01:27.466410312 +0000 UTC m=+1636.335739269" watchObservedRunningTime="2026-03-18 16:01:27.469000762 +0000 UTC m=+1636.338329699" Mar 18 16:01:28 crc kubenswrapper[4792]: I0318 16:01:28.475093 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:29 crc kubenswrapper[4792]: I0318 16:01:29.485098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerStarted","Data":"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d"} Mar 18 16:01:29 crc kubenswrapper[4792]: I0318 16:01:29.487166 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-central-agent" containerID="cri-o://655b2a3a9cdcc64ea17a85de2ecab35ce520652eb2f54bb37619fa4cf8248a7c" gracePeriod=30 Mar 18 16:01:29 crc kubenswrapper[4792]: I0318 16:01:29.487354 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-notification-agent" containerID="cri-o://e1d2bb81f32e35db05f03ab8f39a604a75f805c4bbf4826487468e3c015e6ec0" gracePeriod=30 Mar 18 16:01:29 crc kubenswrapper[4792]: I0318 16:01:29.487195 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="proxy-httpd" containerID="cri-o://998cd0cf05d1ada1d78a255981ee6963d0a94c693d5da2662cf667c22d1a87d6" gracePeriod=30 Mar 18 16:01:29 crc kubenswrapper[4792]: I0318 16:01:29.487255 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="sg-core" containerID="cri-o://3044f1f48cb6f4fb2e217d92c04b84fdef0037fbff43be713e563b3579bd311e" gracePeriod=30 Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.500412 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerID="998cd0cf05d1ada1d78a255981ee6963d0a94c693d5da2662cf667c22d1a87d6" exitCode=0 Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.501017 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerID="3044f1f48cb6f4fb2e217d92c04b84fdef0037fbff43be713e563b3579bd311e" exitCode=2 Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.501032 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerID="e1d2bb81f32e35db05f03ab8f39a604a75f805c4bbf4826487468e3c015e6ec0" exitCode=0 
Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.501060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerDied","Data":"998cd0cf05d1ada1d78a255981ee6963d0a94c693d5da2662cf667c22d1a87d6"} Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.501090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerDied","Data":"3044f1f48cb6f4fb2e217d92c04b84fdef0037fbff43be713e563b3579bd311e"} Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.501103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerDied","Data":"e1d2bb81f32e35db05f03ab8f39a604a75f805c4bbf4826487468e3c015e6ec0"} Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.674618 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:30 crc kubenswrapper[4792]: I0318 16:01:30.674710 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.516697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerStarted","Data":"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5"} Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.517064 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-api" containerID="cri-o://1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc" gracePeriod=30 Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.517116 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-listener" containerID="cri-o://543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5" gracePeriod=30 Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.517171 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-notifier" containerID="cri-o://2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d" gracePeriod=30 Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.517228 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-evaluator" containerID="cri-o://712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59" gracePeriod=30 Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.552386 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.253994155 podStartE2EDuration="9.552357544s" podCreationTimestamp="2026-03-18 16:01:22 +0000 UTC" firstStartedPulling="2026-03-18 16:01:23.10514611 +0000 UTC m=+1631.974475047" lastFinishedPulling="2026-03-18 16:01:30.403509499 +0000 UTC m=+1639.272838436" observedRunningTime="2026-03-18 16:01:31.537211116 +0000 UTC m=+1640.406540053" watchObservedRunningTime="2026-03-18 16:01:31.552357544 +0000 UTC m=+1640.421686481" Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.686264 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:31 crc kubenswrapper[4792]: I0318 16:01:31.686579 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.539277 4792 generic.go:334] "Generic (PLEG): container finished" podID="6a77d6c0-2e1b-4b20-a323-8991335efabc" containerID="1fb4e977184085feb898aaa04e01d3b12d3b2c0f491e933a04720786f6ea2b7b" exitCode=0 Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.539630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlvbw" event={"ID":"6a77d6c0-2e1b-4b20-a323-8991335efabc","Type":"ContainerDied","Data":"1fb4e977184085feb898aaa04e01d3b12d3b2c0f491e933a04720786f6ea2b7b"} Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.545423 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerID="655b2a3a9cdcc64ea17a85de2ecab35ce520652eb2f54bb37619fa4cf8248a7c" exitCode=0 Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.545501 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerDied","Data":"655b2a3a9cdcc64ea17a85de2ecab35ce520652eb2f54bb37619fa4cf8248a7c"} Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.547793 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerID="712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59" exitCode=0 Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.547816 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerID="1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc" exitCode=0 Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.547883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerDied","Data":"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59"} Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.547903 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerDied","Data":"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc"} Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.644360 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.793694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.793750 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.793929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.793953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: 
\"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.794241 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.794717 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.794900 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd6rb\" (UniqueName: \"kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.795272 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd\") pod \"2d10a886-f37c-4f20-ae2a-870f8e890205\" (UID: \"2d10a886-f37c-4f20-ae2a-870f8e890205\") " Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.795835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.796839 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.796865 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d10a886-f37c-4f20-ae2a-870f8e890205-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.801278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts" (OuterVolumeSpecName: "scripts") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.801446 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb" (OuterVolumeSpecName: "kube-api-access-nd6rb") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "kube-api-access-nd6rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.835346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.910758 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.911960 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.912002 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd6rb\" (UniqueName: \"kubernetes.io/projected/2d10a886-f37c-4f20-ae2a-870f8e890205-kube-api-access-nd6rb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.923124 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:32 crc kubenswrapper[4792]: I0318 16:01:32.927943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data" (OuterVolumeSpecName: "config-data") pod "2d10a886-f37c-4f20-ae2a-870f8e890205" (UID: "2d10a886-f37c-4f20-ae2a-870f8e890205"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.015856 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.016630 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10a886-f37c-4f20-ae2a-870f8e890205-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.589432 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.589455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d10a886-f37c-4f20-ae2a-870f8e890205","Type":"ContainerDied","Data":"03a98b9886ccbbf6211695b68e49e61efef71d149f82d10213868775aff64e54"} Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.589517 4792 scope.go:117] "RemoveContainer" containerID="998cd0cf05d1ada1d78a255981ee6963d0a94c693d5da2662cf667c22d1a87d6" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.632737 4792 scope.go:117] "RemoveContainer" containerID="3044f1f48cb6f4fb2e217d92c04b84fdef0037fbff43be713e563b3579bd311e" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.657562 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.683645 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.694169 4792 scope.go:117] "RemoveContainer" containerID="e1d2bb81f32e35db05f03ab8f39a604a75f805c4bbf4826487468e3c015e6ec0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699129 4792 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699699 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="proxy-httpd" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699713 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="proxy-httpd" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699729 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-notification-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699735 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-notification-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699768 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="sg-core" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699775 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="sg-core" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699789 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-central-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-central-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699816 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="dnsmasq-dns" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699823 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="dnsmasq-dns" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.699840 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="init" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.699846 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="init" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.700118 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-central-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.700137 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="ceilometer-notification-agent" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.700150 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1372258e-0814-4692-9694-7dc28a519871" containerName="dnsmasq-dns" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.700162 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="sg-core" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.700175 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" containerName="proxy-httpd" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.702884 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.705924 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.706245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.720204 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.743490 4792 scope.go:117] "RemoveContainer" containerID="655b2a3a9cdcc64ea17a85de2ecab35ce520652eb2f54bb37619fa4cf8248a7c" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852588 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852809 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852841 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852904 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.852940 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.853030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghjrr\" (UniqueName: \"kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.865363 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:01:33 crc kubenswrapper[4792]: E0318 16:01:33.865748 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 
18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.875280 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d10a886-f37c-4f20-ae2a-870f8e890205" path="/var/lib/kubelet/pods/2d10a886-f37c-4f20-ae2a-870f8e890205/volumes" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.955621 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.955680 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.955863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.955929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.956022 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " 
pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.956063 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.956128 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghjrr\" (UniqueName: \"kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.958345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.958656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.966257 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.967376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.968235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.971485 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:33 crc kubenswrapper[4792]: I0318 16:01:33.975429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghjrr\" (UniqueName: \"kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr\") pod \"ceilometer-0\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " pod="openstack/ceilometer-0" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.053505 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:01:34 crc kubenswrapper[4792]: E0318 16:01:34.185249 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.242825 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.365816 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:34 crc kubenswrapper[4792]: E0318 16:01:34.366597 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a77d6c0-2e1b-4b20-a323-8991335efabc" containerName="nova-manage" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.366621 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a77d6c0-2e1b-4b20-a323-8991335efabc" containerName="nova-manage" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.366846 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a77d6c0-2e1b-4b20-a323-8991335efabc" containerName="nova-manage" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.368848 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.370116 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data\") pod \"6a77d6c0-2e1b-4b20-a323-8991335efabc\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.370209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle\") pod \"6a77d6c0-2e1b-4b20-a323-8991335efabc\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.370419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbgbh\" (UniqueName: \"kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh\") pod \"6a77d6c0-2e1b-4b20-a323-8991335efabc\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.370507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts\") pod \"6a77d6c0-2e1b-4b20-a323-8991335efabc\" (UID: \"6a77d6c0-2e1b-4b20-a323-8991335efabc\") " Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.377273 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts" (OuterVolumeSpecName: "scripts") pod "6a77d6c0-2e1b-4b20-a323-8991335efabc" (UID: "6a77d6c0-2e1b-4b20-a323-8991335efabc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.378237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh" (OuterVolumeSpecName: "kube-api-access-vbgbh") pod "6a77d6c0-2e1b-4b20-a323-8991335efabc" (UID: "6a77d6c0-2e1b-4b20-a323-8991335efabc"). InnerVolumeSpecName "kube-api-access-vbgbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.408013 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.414340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a77d6c0-2e1b-4b20-a323-8991335efabc" (UID: "6a77d6c0-2e1b-4b20-a323-8991335efabc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.418462 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data" (OuterVolumeSpecName: "config-data") pod "6a77d6c0-2e1b-4b20-a323-8991335efabc" (UID: "6a77d6c0-2e1b-4b20-a323-8991335efabc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.475497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.475604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.476736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjjq\" (UniqueName: \"kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.477294 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.477332 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbgbh\" (UniqueName: \"kubernetes.io/projected/6a77d6c0-2e1b-4b20-a323-8991335efabc-kube-api-access-vbgbh\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.477376 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.477392 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a77d6c0-2e1b-4b20-a323-8991335efabc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.579580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.579641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.579784 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjjq\" (UniqueName: \"kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.580545 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 
16:01:34.580744 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.599117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjjq\" (UniqueName: \"kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq\") pod \"community-operators-99vlb\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.606740 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlvbw" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.609266 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlvbw" event={"ID":"6a77d6c0-2e1b-4b20-a323-8991335efabc","Type":"ContainerDied","Data":"56c938f5daf1b9a0d7ef482fb94c2faacaa93f112fd5536eb081685f1dbcca74"} Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.609306 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c938f5daf1b9a0d7ef482fb94c2faacaa93f112fd5536eb081685f1dbcca74" Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.620291 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.763049 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.763329 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" 
containerName="nova-scheduler-scheduler" containerID="cri-o://a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f" gracePeriod=30 Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.783450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.783703 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-log" containerID="cri-o://d65c940eb809ba3c56bd4052caae015e727cc134cb3502a23bcd3f2fc02eb84a" gracePeriod=30 Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.784245 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-api" containerID="cri-o://794e7cb78223401e500978468d4c8684ae038b683f3d0d00f44838f6eef4e214" gracePeriod=30 Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.800185 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.800459 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-log" containerID="cri-o://ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155" gracePeriod=30 Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.801152 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-metadata" containerID="cri-o://213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89" gracePeriod=30 Mar 18 16:01:34 crc kubenswrapper[4792]: I0318 16:01:34.821530 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.505234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:35 crc kubenswrapper[4792]: W0318 16:01:35.531020 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7679bc06_61d9_47b3_a2ae_4177aca7f46d.slice/crio-cd776b86d4126b0089cb3aed74c449a55a238c026d0ae9a046258676a2609e22 WatchSource:0}: Error finding container cd776b86d4126b0089cb3aed74c449a55a238c026d0ae9a046258676a2609e22: Status 404 returned error can't find the container with id cd776b86d4126b0089cb3aed74c449a55a238c026d0ae9a046258676a2609e22 Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.644824 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerStarted","Data":"1b83e3e4ff459559f8cb1164013e6efa50412db5ffd9b614525693d4e5083300"} Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.645396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerStarted","Data":"81e6b7f34ab051ce05653e995b5f6f441931e5d2f95626202b4084be50fd761e"} Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.666127 4792 generic.go:334] "Generic (PLEG): container finished" podID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerID="d65c940eb809ba3c56bd4052caae015e727cc134cb3502a23bcd3f2fc02eb84a" exitCode=143 Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.666203 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerDied","Data":"d65c940eb809ba3c56bd4052caae015e727cc134cb3502a23bcd3f2fc02eb84a"} Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 
16:01:35.668170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerStarted","Data":"cd776b86d4126b0089cb3aed74c449a55a238c026d0ae9a046258676a2609e22"} Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.671570 4792 generic.go:334] "Generic (PLEG): container finished" podID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerID="ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155" exitCode=143 Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.671628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerDied","Data":"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155"} Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.678095 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerID="2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d" exitCode=0 Mar 18 16:01:35 crc kubenswrapper[4792]: I0318 16:01:35.678126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerDied","Data":"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d"} Mar 18 16:01:36 crc kubenswrapper[4792]: I0318 16:01:36.696585 4792 generic.go:334] "Generic (PLEG): container finished" podID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerID="f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346" exitCode=0 Mar 18 16:01:36 crc kubenswrapper[4792]: I0318 16:01:36.697490 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerDied","Data":"f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346"} Mar 18 16:01:36 crc kubenswrapper[4792]: 
I0318 16:01:36.707749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerStarted","Data":"6d55978b902f66200b8c4a43c7e7206443af90a367cdf050790e3f9dd1019152"} Mar 18 16:01:37 crc kubenswrapper[4792]: I0318 16:01:37.732276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerStarted","Data":"63adba061dac334662a4d1e14c3584be5c0d8d09a223606d7a5d44d229cb80bf"} Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.423241 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.433320 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.434920 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.435010 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-scheduler-0" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" containerName="nova-scheduler-scheduler" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.563345 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.673297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.673363 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.697027 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle\") pod \"2bff767f-8a15-46ae-bec8-28c8d641847b\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.697171 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rgwb\" (UniqueName: \"kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb\") pod \"2bff767f-8a15-46ae-bec8-28c8d641847b\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.697209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs\") pod \"2bff767f-8a15-46ae-bec8-28c8d641847b\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.697256 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data\") pod \"2bff767f-8a15-46ae-bec8-28c8d641847b\" (UID: 
\"2bff767f-8a15-46ae-bec8-28c8d641847b\") " Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.697466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs\") pod \"2bff767f-8a15-46ae-bec8-28c8d641847b\" (UID: \"2bff767f-8a15-46ae-bec8-28c8d641847b\") " Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.705213 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb" (OuterVolumeSpecName: "kube-api-access-6rgwb") pod "2bff767f-8a15-46ae-bec8-28c8d641847b" (UID: "2bff767f-8a15-46ae-bec8-28c8d641847b"). InnerVolumeSpecName "kube-api-access-6rgwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.736263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs" (OuterVolumeSpecName: "logs") pod "2bff767f-8a15-46ae-bec8-28c8d641847b" (UID: "2bff767f-8a15-46ae-bec8-28c8d641847b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.759539 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bff767f-8a15-46ae-bec8-28c8d641847b" (UID: "2bff767f-8a15-46ae-bec8-28c8d641847b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.759592 4792 generic.go:334] "Generic (PLEG): container finished" podID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerID="794e7cb78223401e500978468d4c8684ae038b683f3d0d00f44838f6eef4e214" exitCode=0 Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.759659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerDied","Data":"794e7cb78223401e500978468d4c8684ae038b683f3d0d00f44838f6eef4e214"} Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.762594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerStarted","Data":"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b"} Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.771701 4792 generic.go:334] "Generic (PLEG): container finished" podID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerID="213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89" exitCode=0 Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.771791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerDied","Data":"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89"} Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.771833 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2bff767f-8a15-46ae-bec8-28c8d641847b","Type":"ContainerDied","Data":"a982166703556f9443e61f6bf32b7980f4c3b998cdba22829ff5ced6cdba4dd1"} Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.771849 4792 scope.go:117] "RemoveContainer" containerID="213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89" Mar 18 16:01:38 crc 
kubenswrapper[4792]: I0318 16:01:38.772029 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.774813 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data" (OuterVolumeSpecName: "config-data") pod "2bff767f-8a15-46ae-bec8-28c8d641847b" (UID: "2bff767f-8a15-46ae-bec8-28c8d641847b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.793237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2bff767f-8a15-46ae-bec8-28c8d641847b" (UID: "2bff767f-8a15-46ae-bec8-28c8d641847b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.800571 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.800609 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.800619 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rgwb\" (UniqueName: \"kubernetes.io/projected/2bff767f-8a15-46ae-bec8-28c8d641847b-kube-api-access-6rgwb\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.800629 4792 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bff767f-8a15-46ae-bec8-28c8d641847b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.800640 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bff767f-8a15-46ae-bec8-28c8d641847b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.923340 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.934577 4792 scope.go:117] "RemoveContainer" containerID="ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.969832 4792 scope.go:117] "RemoveContainer" containerID="213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89" Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.980961 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89\": container with ID starting with 213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89 not found: ID does not exist" containerID="213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89" Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.981035 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89"} err="failed to get container status \"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89\": rpc error: code = NotFound desc = could not find container \"213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89\": container with ID starting with 213b10b0626ba6b1f20f0d70cb29f72513c7ef2b6f47af3cebd0ea7f2341bc89 not found: ID does not exist" Mar 18 16:01:38 crc 
kubenswrapper[4792]: I0318 16:01:38.981074 4792 scope.go:117] "RemoveContainer" containerID="ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155"
Mar 18 16:01:38 crc kubenswrapper[4792]: E0318 16:01:38.981538 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155\": container with ID starting with ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155 not found: ID does not exist" containerID="ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155"
Mar 18 16:01:38 crc kubenswrapper[4792]: I0318 16:01:38.981571 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155"} err="failed to get container status \"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155\": rpc error: code = NotFound desc = could not find container \"ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155\": container with ID starting with ac7fa6128043750d254257bb1754349874ce156b4c4cef86d5246d6d6dad0155 not found: ID does not exist"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.108729 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.115957 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.116564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.116988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.117117 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.117285 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d6wr\" (UniqueName: \"kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr\") pod \"de841d5a-8f14-4578-b8d7-e132eebadc2c\" (UID: \"de841d5a-8f14-4578-b8d7-e132eebadc2c\") "
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.118160 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs" (OuterVolumeSpecName: "logs") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.122928 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de841d5a-8f14-4578-b8d7-e132eebadc2c-logs\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.140829 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.143237 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bff767f_8a15_46ae_bec8_28c8d641847b.slice/crio-a982166703556f9443e61f6bf32b7980f4c3b998cdba22829ff5ced6cdba4dd1\": RecentStats: unable to find data in memory cache]"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.167231 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr" (OuterVolumeSpecName: "kube-api-access-7d6wr") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "kube-api-access-7d6wr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.180135 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.188461 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.200453 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.205199 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-api"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205244 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-api"
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.205295 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-metadata"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205304 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-metadata"
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.205320 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205329 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.205359 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205367 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205807 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205838 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" containerName="nova-api-api"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205860 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-log"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.205872 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" containerName="nova-metadata-metadata"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.207540 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.210795 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.210907 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.227706 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.228437 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d6wr\" (UniqueName: \"kubernetes.io/projected/de841d5a-8f14-4578-b8d7-e132eebadc2c-kube-api-access-7d6wr\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.228603 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.233657 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data" (OuterVolumeSpecName: "config-data") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.263152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.275284 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "de841d5a-8f14-4578-b8d7-e132eebadc2c" (UID: "de841d5a-8f14-4578-b8d7-e132eebadc2c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.330583 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clf62\" (UniqueName: \"kubernetes.io/projected/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-kube-api-access-clf62\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-logs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-config-data\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331690 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331703 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.331714 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de841d5a-8f14-4578-b8d7-e132eebadc2c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.435105 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-logs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.436226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.436262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-config-data\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.436288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.436340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clf62\" (UniqueName: \"kubernetes.io/projected/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-kube-api-access-clf62\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.436962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-logs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.447718 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.455683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-config-data\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.456126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.461368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clf62\" (UniqueName: \"kubernetes.io/projected/edb37f7a-3e7f-42f6-8f05-f89ea71a1f02-kube-api-access-clf62\") pod \"nova-metadata-0\" (UID: \"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02\") " pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.540639 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.798449 4792 generic.go:334] "Generic (PLEG): container finished" podID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" containerID="a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f" exitCode=0
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.798542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a22d88-19a8-4567-ba96-b3d2de7d9553","Type":"ContainerDied","Data":"a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f"}
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.801118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.801240 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de841d5a-8f14-4578-b8d7-e132eebadc2c","Type":"ContainerDied","Data":"a383e0634620955f0596876f4f7bf78276f21db4bf3c578072f149d6e9a2b6ad"}
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.801311 4792 scope.go:117] "RemoveContainer" containerID="794e7cb78223401e500978468d4c8684ae038b683f3d0d00f44838f6eef4e214"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.814555 4792 generic.go:334] "Generic (PLEG): container finished" podID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerID="21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b" exitCode=0
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.814643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerDied","Data":"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b"}
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.857754 4792 scope.go:117] "RemoveContainer" containerID="d65c940eb809ba3c56bd4052caae015e727cc134cb3502a23bcd3f2fc02eb84a"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.895487 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bff767f-8a15-46ae-bec8-28c8d641847b" path="/var/lib/kubelet/pods/2bff767f-8a15-46ae-bec8-28c8d641847b/volumes"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.932879 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.941472 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.969201 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.984357 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:01:39 crc kubenswrapper[4792]: E0318 16:01:39.985036 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" containerName="nova-scheduler-scheduler"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.985051 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" containerName="nova-scheduler-scheduler"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.985278 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" containerName="nova-scheduler-scheduler"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.986569 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.993021 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.994593 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.995627 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 16:01:39 crc kubenswrapper[4792]: I0318 16:01:39.997177 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.064209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fs2\" (UniqueName: \"kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2\") pod \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") "
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.064340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle\") pod \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") "
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.064624 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data\") pod \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\" (UID: \"d9a22d88-19a8-4567-ba96-b3d2de7d9553\") "
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.071189 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2" (OuterVolumeSpecName: "kube-api-access-p6fs2") pod "d9a22d88-19a8-4567-ba96-b3d2de7d9553" (UID: "d9a22d88-19a8-4567-ba96-b3d2de7d9553"). InnerVolumeSpecName "kube-api-access-p6fs2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.100998 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data" (OuterVolumeSpecName: "config-data") pod "d9a22d88-19a8-4567-ba96-b3d2de7d9553" (UID: "d9a22d88-19a8-4567-ba96-b3d2de7d9553"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.103176 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9a22d88-19a8-4567-ba96-b3d2de7d9553" (UID: "d9a22d88-19a8-4567-ba96-b3d2de7d9553"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.168649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.168747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-public-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.168850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-logs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.168924 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-config-data\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.169109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.169416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rvh\" (UniqueName: \"kubernetes.io/projected/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-kube-api-access-s4rvh\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.169631 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.169649 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6fs2\" (UniqueName: \"kubernetes.io/projected/d9a22d88-19a8-4567-ba96-b3d2de7d9553-kube-api-access-p6fs2\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.169665 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a22d88-19a8-4567-ba96-b3d2de7d9553-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.176851 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 16:01:40 crc kubenswrapper[4792]: W0318 16:01:40.177776 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedb37f7a_3e7f_42f6_8f05_f89ea71a1f02.slice/crio-9e46417dc39893339c9d8f446465111ebebba95e6ea94f43c2239f4470baad97 WatchSource:0}: Error finding container 9e46417dc39893339c9d8f446465111ebebba95e6ea94f43c2239f4470baad97: Status 404 returned error can't find the container with id 9e46417dc39893339c9d8f446465111ebebba95e6ea94f43c2239f4470baad97
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.273374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rvh\" (UniqueName: \"kubernetes.io/projected/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-kube-api-access-s4rvh\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.273510 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.274336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-public-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.274474 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-logs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.274599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-config-data\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.274683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.275707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-logs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.278671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.279182 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.280445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-public-tls-certs\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.280994 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-config-data\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.300460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rvh\" (UniqueName: \"kubernetes.io/projected/d5b6299c-8e67-4a48-8dd3-ef558e0f7b23-kube-api-access-s4rvh\") pod \"nova-api-0\" (UID: \"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23\") " pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.322757 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.835908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerStarted","Data":"e3c0df205c3858808c62ee66771b1014bf467098c84899489157bbfd42dffbf7"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.836656 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.839032 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.839076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a22d88-19a8-4567-ba96-b3d2de7d9553","Type":"ContainerDied","Data":"7d9987e8130a1cdab6f0dacbe40a949eceeb87cd8cd10200d67fa6a0d74b48be"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.839130 4792 scope.go:117] "RemoveContainer" containerID="a26a16825f0650fa6c9f8748ab0328fc5e3447d9b1ab8f08a28cbd04dc587e8f"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.842112 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02","Type":"ContainerStarted","Data":"7c7d34a4061a8678efbdca588f2807db3de924b234e21844e708596e9945325c"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.842158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02","Type":"ContainerStarted","Data":"088899cdce4b88d3ca7b052cc86c0edb6708f7a6223d9be839213a819a36a65c"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.842193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"edb37f7a-3e7f-42f6-8f05-f89ea71a1f02","Type":"ContainerStarted","Data":"9e46417dc39893339c9d8f446465111ebebba95e6ea94f43c2239f4470baad97"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.853349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerStarted","Data":"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa"}
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.878948 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.881142 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.1756443069999998 podStartE2EDuration="7.881125701s" podCreationTimestamp="2026-03-18 16:01:33 +0000 UTC" firstStartedPulling="2026-03-18 16:01:34.612660359 +0000 UTC m=+1643.481989286" lastFinishedPulling="2026-03-18 16:01:39.318141743 +0000 UTC m=+1648.187470680" observedRunningTime="2026-03-18 16:01:40.869043467 +0000 UTC m=+1649.738372414" watchObservedRunningTime="2026-03-18 16:01:40.881125701 +0000 UTC m=+1649.750454638"
Mar 18 16:01:40 crc kubenswrapper[4792]: W0318 16:01:40.885224 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5b6299c_8e67_4a48_8dd3_ef558e0f7b23.slice/crio-4b45ec8e0afb2f1b1ebca9ad7c62410d6421bf3adff1f287aa8669b2efeb1a9f WatchSource:0}: Error finding container 4b45ec8e0afb2f1b1ebca9ad7c62410d6421bf3adff1f287aa8669b2efeb1a9f: Status 404 returned error can't find the container with id 4b45ec8e0afb2f1b1ebca9ad7c62410d6421bf3adff1f287aa8669b2efeb1a9f
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.907301 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.90728498 podStartE2EDuration="1.90728498s" podCreationTimestamp="2026-03-18 16:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:40.8901358 +0000 UTC m=+1649.759464737" watchObservedRunningTime="2026-03-18 16:01:40.90728498 +0000 UTC m=+1649.776613917"
Mar 18 16:01:40 crc kubenswrapper[4792]: I0318 16:01:40.924266 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99vlb" podStartSLOduration=3.232667485 podStartE2EDuration="6.924245395s" podCreationTimestamp="2026-03-18 16:01:34 +0000 UTC" firstStartedPulling="2026-03-18 16:01:36.704768425 +0000 UTC m=+1645.574097362" lastFinishedPulling="2026-03-18 16:01:40.396346335 +0000 UTC m=+1649.265675272" observedRunningTime="2026-03-18 16:01:40.922518591 +0000 UTC m=+1649.791847558" watchObservedRunningTime="2026-03-18 16:01:40.924245395 +0000 UTC m=+1649.793574332"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.012711 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.038505 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.051658 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.053872 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.058576 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.062559 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.217492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-config-data\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.217824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.218036 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjj6k\" (UniqueName: \"kubernetes.io/projected/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-kube-api-access-hjj6k\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.321150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-config-data\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0"
Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.321212 4792 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.321375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjj6k\" (UniqueName: \"kubernetes.io/projected/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-kube-api-access-hjj6k\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.324836 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-config-data\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.329858 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.339416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjj6k\" (UniqueName: \"kubernetes.io/projected/f63e8923-c2dd-459a-8019-ae9fcdbe6f92-kube-api-access-hjj6k\") pod \"nova-scheduler-0\" (UID: \"f63e8923-c2dd-459a-8019-ae9fcdbe6f92\") " pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.390096 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.874172 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a22d88-19a8-4567-ba96-b3d2de7d9553" path="/var/lib/kubelet/pods/d9a22d88-19a8-4567-ba96-b3d2de7d9553/volumes" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.875221 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de841d5a-8f14-4578-b8d7-e132eebadc2c" path="/var/lib/kubelet/pods/de841d5a-8f14-4578-b8d7-e132eebadc2c/volumes" Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.878058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23","Type":"ContainerStarted","Data":"d2e477e8c87134b015ca34cb90ee5d5c36f46ec70a6a172b659b24a91adf3158"} Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.878107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23","Type":"ContainerStarted","Data":"d0a5dbf66cea1a17e068d6412d1c4a6bebff4f130d88947f86b6566797554e61"} Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.878121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d5b6299c-8e67-4a48-8dd3-ef558e0f7b23","Type":"ContainerStarted","Data":"4b45ec8e0afb2f1b1ebca9ad7c62410d6421bf3adff1f287aa8669b2efeb1a9f"} Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.934026 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 16:01:41 crc kubenswrapper[4792]: W0318 16:01:41.939218 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf63e8923_c2dd_459a_8019_ae9fcdbe6f92.slice/crio-dbc3cf7fa5c3d7f2bcada87cd5798f0f255542a74c5e9e288bdf7729e2cc4a73 WatchSource:0}: Error finding container 
dbc3cf7fa5c3d7f2bcada87cd5798f0f255542a74c5e9e288bdf7729e2cc4a73: Status 404 returned error can't find the container with id dbc3cf7fa5c3d7f2bcada87cd5798f0f255542a74c5e9e288bdf7729e2cc4a73 Mar 18 16:01:41 crc kubenswrapper[4792]: I0318 16:01:41.949493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.949473317 podStartE2EDuration="2.949473317s" podCreationTimestamp="2026-03-18 16:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:41.939622023 +0000 UTC m=+1650.808950960" watchObservedRunningTime="2026-03-18 16:01:41.949473317 +0000 UTC m=+1650.818802254" Mar 18 16:01:42 crc kubenswrapper[4792]: I0318 16:01:42.903988 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f63e8923-c2dd-459a-8019-ae9fcdbe6f92","Type":"ContainerStarted","Data":"325d791ff528cab9d1633ba5816ca1c16f3438da2011e8099a51cb081248cb20"} Mar 18 16:01:42 crc kubenswrapper[4792]: I0318 16:01:42.905036 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f63e8923-c2dd-459a-8019-ae9fcdbe6f92","Type":"ContainerStarted","Data":"dbc3cf7fa5c3d7f2bcada87cd5798f0f255542a74c5e9e288bdf7729e2cc4a73"} Mar 18 16:01:42 crc kubenswrapper[4792]: I0318 16:01:42.926074 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.926052978 podStartE2EDuration="2.926052978s" podCreationTimestamp="2026-03-18 16:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:01:42.9219205 +0000 UTC m=+1651.791249437" watchObservedRunningTime="2026-03-18 16:01:42.926052978 +0000 UTC m=+1651.795381915" Mar 18 16:01:44 crc kubenswrapper[4792]: E0318 16:01:44.530396 4792 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:44 crc kubenswrapper[4792]: I0318 16:01:44.821869 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:44 crc kubenswrapper[4792]: I0318 16:01:44.822157 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:44 crc kubenswrapper[4792]: I0318 16:01:44.881230 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:46 crc kubenswrapper[4792]: I0318 16:01:46.391031 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 16:01:48 crc kubenswrapper[4792]: E0318 16:01:48.104528 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:48 crc kubenswrapper[4792]: E0318 16:01:48.104775 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:48 crc kubenswrapper[4792]: I0318 16:01:48.854474 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:01:48 crc kubenswrapper[4792]: E0318 16:01:48.855199 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:01:49 crc kubenswrapper[4792]: I0318 16:01:49.541951 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:49 crc kubenswrapper[4792]: I0318 16:01:49.542020 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 16:01:50 crc kubenswrapper[4792]: I0318 16:01:50.323777 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:50 crc kubenswrapper[4792]: I0318 16:01:50.324220 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 16:01:50 crc kubenswrapper[4792]: I0318 16:01:50.556494 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="edb37f7a-3e7f-42f6-8f05-f89ea71a1f02" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.17:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:50 crc kubenswrapper[4792]: I0318 16:01:50.556843 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="edb37f7a-3e7f-42f6-8f05-f89ea71a1f02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.17:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:51 crc kubenswrapper[4792]: I0318 16:01:51.337148 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d5b6299c-8e67-4a48-8dd3-ef558e0f7b23" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:51 crc kubenswrapper[4792]: I0318 16:01:51.337209 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d5b6299c-8e67-4a48-8dd3-ef558e0f7b23" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.18:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:01:51 crc kubenswrapper[4792]: I0318 16:01:51.391215 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 16:01:51 crc kubenswrapper[4792]: I0318 16:01:51.426129 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 16:01:52 crc kubenswrapper[4792]: I0318 16:01:52.058858 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 16:01:54 crc kubenswrapper[4792]: E0318 16:01:54.186928 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:54 crc kubenswrapper[4792]: E0318 16:01:54.581187 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache]" Mar 18 16:01:54 crc kubenswrapper[4792]: I0318 16:01:54.881814 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:54 crc kubenswrapper[4792]: I0318 16:01:54.938415 4792 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.044051 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-99vlb" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="registry-server" containerID="cri-o://28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa" gracePeriod=2 Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.641778 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.781203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities\") pod \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.781623 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content\") pod \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.781714 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hjjq\" (UniqueName: \"kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq\") pod \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\" (UID: \"7679bc06-61d9-47b3-a2ae-4177aca7f46d\") " Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.781761 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities" (OuterVolumeSpecName: "utilities") pod "7679bc06-61d9-47b3-a2ae-4177aca7f46d" 
(UID: "7679bc06-61d9-47b3-a2ae-4177aca7f46d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.782492 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.805755 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq" (OuterVolumeSpecName: "kube-api-access-9hjjq") pod "7679bc06-61d9-47b3-a2ae-4177aca7f46d" (UID: "7679bc06-61d9-47b3-a2ae-4177aca7f46d"). InnerVolumeSpecName "kube-api-access-9hjjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.855637 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7679bc06-61d9-47b3-a2ae-4177aca7f46d" (UID: "7679bc06-61d9-47b3-a2ae-4177aca7f46d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.885209 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hjjq\" (UniqueName: \"kubernetes.io/projected/7679bc06-61d9-47b3-a2ae-4177aca7f46d-kube-api-access-9hjjq\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:55 crc kubenswrapper[4792]: I0318 16:01:55.885246 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7679bc06-61d9-47b3-a2ae-4177aca7f46d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.067302 4792 generic.go:334] "Generic (PLEG): container finished" podID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerID="28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa" exitCode=0 Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.067350 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99vlb" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.067379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerDied","Data":"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa"} Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.067913 4792 scope.go:117] "RemoveContainer" containerID="28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.069593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99vlb" event={"ID":"7679bc06-61d9-47b3-a2ae-4177aca7f46d","Type":"ContainerDied","Data":"cd776b86d4126b0089cb3aed74c449a55a238c026d0ae9a046258676a2609e22"} Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.097283 4792 scope.go:117] "RemoveContainer" 
containerID="21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.101526 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.118349 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-99vlb"] Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.124787 4792 scope.go:117] "RemoveContainer" containerID="f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.174128 4792 scope.go:117] "RemoveContainer" containerID="28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa" Mar 18 16:01:56 crc kubenswrapper[4792]: E0318 16:01:56.174687 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa\": container with ID starting with 28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa not found: ID does not exist" containerID="28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.174784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa"} err="failed to get container status \"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa\": rpc error: code = NotFound desc = could not find container \"28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa\": container with ID starting with 28f2c094501b6842bf6f6fb36647d2ca3c0c4eb7cf8abcbd1313c25943ca4caa not found: ID does not exist" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.174865 4792 scope.go:117] "RemoveContainer" 
containerID="21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b" Mar 18 16:01:56 crc kubenswrapper[4792]: E0318 16:01:56.175333 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b\": container with ID starting with 21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b not found: ID does not exist" containerID="21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.175436 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b"} err="failed to get container status \"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b\": rpc error: code = NotFound desc = could not find container \"21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b\": container with ID starting with 21fa4ac85bd0fbb4aa609d1ab6900d00aba7f1930c89d98aadd43e5433b8170b not found: ID does not exist" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.175510 4792 scope.go:117] "RemoveContainer" containerID="f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346" Mar 18 16:01:56 crc kubenswrapper[4792]: E0318 16:01:56.175859 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346\": container with ID starting with f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346 not found: ID does not exist" containerID="f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346" Mar 18 16:01:56 crc kubenswrapper[4792]: I0318 16:01:56.175889 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346"} err="failed to get container status \"f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346\": rpc error: code = NotFound desc = could not find container \"f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346\": container with ID starting with f2c3e5b3c8e6fcc3b65c1e583284a993278759ab5197341acf3cf30f4e6c1346 not found: ID does not exist" Mar 18 16:01:57 crc kubenswrapper[4792]: I0318 16:01:57.541376 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:57 crc kubenswrapper[4792]: I0318 16:01:57.541750 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 16:01:57 crc kubenswrapper[4792]: I0318 16:01:57.868936 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" path="/var/lib/kubelet/pods/7679bc06-61d9-47b3-a2ae-4177aca7f46d/volumes" Mar 18 16:01:58 crc kubenswrapper[4792]: I0318 16:01:58.324443 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:58 crc kubenswrapper[4792]: I0318 16:01:58.324652 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 16:01:59 crc kubenswrapper[4792]: I0318 16:01:59.547556 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:59 crc kubenswrapper[4792]: I0318 16:01:59.551478 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 16:01:59 crc kubenswrapper[4792]: I0318 16:01:59.557412 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.130862 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.147642 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564162-z5rgz"] Mar 18 16:02:00 crc kubenswrapper[4792]: E0318 16:02:00.148330 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="extract-utilities" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.148359 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="extract-utilities" Mar 18 16:02:00 crc kubenswrapper[4792]: E0318 16:02:00.148375 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="extract-content" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.148383 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="extract-content" Mar 18 16:02:00 crc kubenswrapper[4792]: E0318 16:02:00.148398 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="registry-server" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.148406 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="registry-server" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.148691 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7679bc06-61d9-47b3-a2ae-4177aca7f46d" containerName="registry-server" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.149793 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.153374 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.153385 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.153482 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.160711 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-z5rgz"] Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.311478 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c9v\" (UniqueName: \"kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v\") pod \"auto-csr-approver-29564162-z5rgz\" (UID: \"a5954f4a-541b-4146-89bd-eda39e5a9664\") " pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.330258 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.330827 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.337775 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.414350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c9v\" (UniqueName: \"kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v\") pod 
\"auto-csr-approver-29564162-z5rgz\" (UID: \"a5954f4a-541b-4146-89bd-eda39e5a9664\") " pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.438420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c9v\" (UniqueName: \"kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v\") pod \"auto-csr-approver-29564162-z5rgz\" (UID: \"a5954f4a-541b-4146-89bd-eda39e5a9664\") " pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:00 crc kubenswrapper[4792]: I0318 16:02:00.478498 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:01 crc kubenswrapper[4792]: I0318 16:02:01.125257 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-z5rgz"] Mar 18 16:02:01 crc kubenswrapper[4792]: W0318 16:02:01.145182 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5954f4a_541b_4146_89bd_eda39e5a9664.slice/crio-2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a WatchSource:0}: Error finding container 2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a: Status 404 returned error can't find the container with id 2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a Mar 18 16:02:01 crc kubenswrapper[4792]: I0318 16:02:01.152283 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 16:02:01 crc kubenswrapper[4792]: I0318 16:02:01.871751 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:02:01 crc kubenswrapper[4792]: E0318 16:02:01.874284 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.159957 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.160483 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerID="543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5" exitCode=137 Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.160536 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerDied","Data":"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5"} Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.160559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1e1d85bc-e0d7-4b04-87a1-83882d658a4a","Type":"ContainerDied","Data":"be67e25b35f59f321961c7cfeb36022b94d547d8ef4869f1f1fec9d55cdac091"} Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.160575 4792 scope.go:117] "RemoveContainer" containerID="543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.164367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" event={"ID":"a5954f4a-541b-4146-89bd-eda39e5a9664","Type":"ContainerStarted","Data":"2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a"} Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.272134 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts\") pod \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.272377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data\") pod \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.273253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8t7v\" (UniqueName: \"kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v\") pod \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.273336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle\") pod \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\" (UID: \"1e1d85bc-e0d7-4b04-87a1-83882d658a4a\") " Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.280202 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v" (OuterVolumeSpecName: "kube-api-access-h8t7v") pod "1e1d85bc-e0d7-4b04-87a1-83882d658a4a" (UID: "1e1d85bc-e0d7-4b04-87a1-83882d658a4a"). InnerVolumeSpecName "kube-api-access-h8t7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.280533 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts" (OuterVolumeSpecName: "scripts") pod "1e1d85bc-e0d7-4b04-87a1-83882d658a4a" (UID: "1e1d85bc-e0d7-4b04-87a1-83882d658a4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.285328 4792 scope.go:117] "RemoveContainer" containerID="2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.376716 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.376749 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8t7v\" (UniqueName: \"kubernetes.io/projected/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-kube-api-access-h8t7v\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.394417 4792 scope.go:117] "RemoveContainer" containerID="712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.414526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e1d85bc-e0d7-4b04-87a1-83882d658a4a" (UID: "1e1d85bc-e0d7-4b04-87a1-83882d658a4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.434027 4792 scope.go:117] "RemoveContainer" containerID="1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.479888 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data" (OuterVolumeSpecName: "config-data") pod "1e1d85bc-e0d7-4b04-87a1-83882d658a4a" (UID: "1e1d85bc-e0d7-4b04-87a1-83882d658a4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.480357 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.480381 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1d85bc-e0d7-4b04-87a1-83882d658a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.482295 4792 scope.go:117] "RemoveContainer" containerID="543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5" Mar 18 16:02:02 crc kubenswrapper[4792]: E0318 16:02:02.485176 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5\": container with ID starting with 543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5 not found: ID does not exist" containerID="543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.485273 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5"} err="failed to get container status \"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5\": rpc error: code = NotFound desc = could not find container \"543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5\": container with ID starting with 543642b9d24ef44f08735815759e82c54eac067069462f1d06d7a045b27abba5 not found: ID does not exist" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.485325 4792 scope.go:117] "RemoveContainer" containerID="2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d" Mar 18 16:02:02 crc kubenswrapper[4792]: E0318 16:02:02.485857 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d\": container with ID starting with 2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d not found: ID does not exist" containerID="2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.485938 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d"} err="failed to get container status \"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d\": rpc error: code = NotFound desc = could not find container \"2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d\": container with ID starting with 2587c25a30a6929076e4ab66cf43348dc65ad9694afc638abccaa7737fd9f47d not found: ID does not exist" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.486010 4792 scope.go:117] "RemoveContainer" containerID="712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59" Mar 18 16:02:02 crc kubenswrapper[4792]: E0318 16:02:02.487326 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59\": container with ID starting with 712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59 not found: ID does not exist" containerID="712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.487394 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59"} err="failed to get container status \"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59\": rpc error: code = NotFound desc = could not find container \"712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59\": container with ID starting with 712d66eeb134eb8e876c92040f0a1ca908439d0c6a09c3a8651d9eef27c51d59 not found: ID does not exist" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.487427 4792 scope.go:117] "RemoveContainer" containerID="1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc" Mar 18 16:02:02 crc kubenswrapper[4792]: E0318 16:02:02.488212 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc\": container with ID starting with 1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc not found: ID does not exist" containerID="1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc" Mar 18 16:02:02 crc kubenswrapper[4792]: I0318 16:02:02.488255 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc"} err="failed to get container status \"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc\": rpc error: code = NotFound desc = could not find container 
\"1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc\": container with ID starting with 1dfe3b6b6d1fb8a555dff2a941eaa95438260183aa3f0afd2024fbd6c70fb4dc not found: ID does not exist" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.185572 4792 generic.go:334] "Generic (PLEG): container finished" podID="a5954f4a-541b-4146-89bd-eda39e5a9664" containerID="99d8b233280187e8ef8e4cf628af40a883b0a835e14403c0be363cbfb929da57" exitCode=0 Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.185734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" event={"ID":"a5954f4a-541b-4146-89bd-eda39e5a9664","Type":"ContainerDied","Data":"99d8b233280187e8ef8e4cf628af40a883b0a835e14403c0be363cbfb929da57"} Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.190302 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.275450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.333860 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.367065 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 16:02:03 crc kubenswrapper[4792]: E0318 16:02:03.367633 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-api" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.367656 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-api" Mar 18 16:02:03 crc kubenswrapper[4792]: E0318 16:02:03.367685 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-notifier" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 
16:02:03.367692 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-notifier" Mar 18 16:02:03 crc kubenswrapper[4792]: E0318 16:02:03.367726 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-evaluator" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.367732 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-evaluator" Mar 18 16:02:03 crc kubenswrapper[4792]: E0318 16:02:03.367746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-listener" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.367753 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-listener" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.368211 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-notifier" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.368230 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-api" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.368252 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-listener" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.368267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" containerName="aodh-evaluator" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.370911 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.375162 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5q5xg" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.375524 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.375733 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.375927 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.376175 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.382994 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434565 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xtsg\" (UniqueName: \"kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434806 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.434920 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.536830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xtsg\" (UniqueName: \"kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.536947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 
16:02:03.536997 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.537030 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.537098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.537131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.546709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.547378 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 
16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.547808 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.551651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.560882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.567318 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xtsg\" (UniqueName: \"kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg\") pod \"aodh-0\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.701036 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:02:03 crc kubenswrapper[4792]: I0318 16:02:03.872407 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1d85bc-e0d7-4b04-87a1-83882d658a4a" path="/var/lib/kubelet/pods/1e1d85bc-e0d7-4b04-87a1-83882d658a4a/volumes" Mar 18 16:02:04 crc kubenswrapper[4792]: I0318 16:02:04.065468 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:02:04 crc kubenswrapper[4792]: I0318 16:02:04.270819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:02:04 crc kubenswrapper[4792]: E0318 16:02:04.632556 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:02:04 crc kubenswrapper[4792]: I0318 16:02:04.731928 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:04 crc kubenswrapper[4792]: I0318 16:02:04.904541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4c9v\" (UniqueName: \"kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v\") pod \"a5954f4a-541b-4146-89bd-eda39e5a9664\" (UID: \"a5954f4a-541b-4146-89bd-eda39e5a9664\") " Mar 18 16:02:04 crc kubenswrapper[4792]: I0318 16:02:04.914264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v" (OuterVolumeSpecName: "kube-api-access-k4c9v") pod "a5954f4a-541b-4146-89bd-eda39e5a9664" (UID: "a5954f4a-541b-4146-89bd-eda39e5a9664"). InnerVolumeSpecName "kube-api-access-k4c9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.008067 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4c9v\" (UniqueName: \"kubernetes.io/projected/a5954f4a-541b-4146-89bd-eda39e5a9664-kube-api-access-k4c9v\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.222264 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.222380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564162-z5rgz" event={"ID":"a5954f4a-541b-4146-89bd-eda39e5a9664","Type":"ContainerDied","Data":"2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a"} Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.222409 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1ac5adb09ed80f0695f2f68ddb6db807c771fc7c902c054698a655a913136a" Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.224889 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerStarted","Data":"e4242b2a6738b067d7a1737c3e37d02b65813652b87fad1065fb6264ef95b129"} Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.825468 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-j4qrz"] Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.844991 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564156-j4qrz"] Mar 18 16:02:05 crc kubenswrapper[4792]: I0318 16:02:05.901522 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a319b20-00fe-4182-8fdb-1f71c5f4f655" path="/var/lib/kubelet/pods/7a319b20-00fe-4182-8fdb-1f71c5f4f655/volumes" Mar 18 16:02:06 crc kubenswrapper[4792]: I0318 16:02:06.284343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerStarted","Data":"e1b40dff9f04b804d952bda6ef843bcf7d9f30a280bdfe8bdd16165dfe23d28e"} Mar 18 16:02:06 crc kubenswrapper[4792]: I0318 16:02:06.284717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerStarted","Data":"339bc94021efac167fcbd3ffdb6eb353ea55450dd68041b222d8c6ff99fe4a3e"} Mar 18 16:02:07 crc kubenswrapper[4792]: I0318 16:02:07.305799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerStarted","Data":"85faedf7edd2232b6dd3eb1b897f02c14706d3a065a10c7854bc795122383dbf"} Mar 18 16:02:08 crc kubenswrapper[4792]: I0318 16:02:08.322074 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerStarted","Data":"b216462437c6ae53381b77f30bb5220e18d1576fea7d99d9aaa310eb1a905c8b"} Mar 18 16:02:08 crc kubenswrapper[4792]: I0318 16:02:08.366407 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.154928776 podStartE2EDuration="5.366383933s" podCreationTimestamp="2026-03-18 16:02:03 +0000 UTC" firstStartedPulling="2026-03-18 16:02:04.286768848 +0000 UTC m=+1673.156097785" lastFinishedPulling="2026-03-18 16:02:07.498224005 +0000 UTC m=+1676.367552942" observedRunningTime="2026-03-18 16:02:08.347273932 +0000 UTC m=+1677.216602899" watchObservedRunningTime="2026-03-18 16:02:08.366383933 +0000 UTC m=+1677.235712870" Mar 18 16:02:09 crc kubenswrapper[4792]: E0318 16:02:09.209865 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8325ad2_da8e_4a5a_b759_57a468b289b7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice/crio-0abe7f2c0c1ddb3237dcbeb15255b05f2f11acf810d82da2397169e527b5fb46\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17418767_af15_46e0_b37e_0c1d8102a2e6.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.127909 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.128505 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="018a5f60-5274-4779-912f-4d7c32b6bfe5" containerName="kube-state-metrics" containerID="cri-o://877641f65256f870166235530babf01a0e7be7e226ff72cb2f08b823a95eaeeb" gracePeriod=30 Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.226073 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.226295 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="269b9513-d2bb-4890-b0fc-11271ef50754" containerName="mysqld-exporter" containerID="cri-o://067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3" gracePeriod=30 Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.358924 4792 generic.go:334] "Generic (PLEG): container finished" podID="018a5f60-5274-4779-912f-4d7c32b6bfe5" containerID="877641f65256f870166235530babf01a0e7be7e226ff72cb2f08b823a95eaeeb" exitCode=2 Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.359006 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"018a5f60-5274-4779-912f-4d7c32b6bfe5","Type":"ContainerDied","Data":"877641f65256f870166235530babf01a0e7be7e226ff72cb2f08b823a95eaeeb"} Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.881226 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.961622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqkc2\" (UniqueName: \"kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2\") pod \"018a5f60-5274-4779-912f-4d7c32b6bfe5\" (UID: \"018a5f60-5274-4779-912f-4d7c32b6bfe5\") " Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.965315 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 16:02:10 crc kubenswrapper[4792]: I0318 16:02:10.973759 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2" (OuterVolumeSpecName: "kube-api-access-rqkc2") pod "018a5f60-5274-4779-912f-4d7c32b6bfe5" (UID: "018a5f60-5274-4779-912f-4d7c32b6bfe5"). InnerVolumeSpecName "kube-api-access-rqkc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.065551 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-485fc\" (UniqueName: \"kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc\") pod \"269b9513-d2bb-4890-b0fc-11271ef50754\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.065613 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle\") pod \"269b9513-d2bb-4890-b0fc-11271ef50754\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.065653 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data\") pod \"269b9513-d2bb-4890-b0fc-11271ef50754\" (UID: \"269b9513-d2bb-4890-b0fc-11271ef50754\") " Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.066317 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqkc2\" (UniqueName: \"kubernetes.io/projected/018a5f60-5274-4779-912f-4d7c32b6bfe5-kube-api-access-rqkc2\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.072390 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc" (OuterVolumeSpecName: "kube-api-access-485fc") pod "269b9513-d2bb-4890-b0fc-11271ef50754" (UID: "269b9513-d2bb-4890-b0fc-11271ef50754"). InnerVolumeSpecName "kube-api-access-485fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.120652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "269b9513-d2bb-4890-b0fc-11271ef50754" (UID: "269b9513-d2bb-4890-b0fc-11271ef50754"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.169283 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-485fc\" (UniqueName: \"kubernetes.io/projected/269b9513-d2bb-4890-b0fc-11271ef50754-kube-api-access-485fc\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.169324 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.169619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data" (OuterVolumeSpecName: "config-data") pod "269b9513-d2bb-4890-b0fc-11271ef50754" (UID: "269b9513-d2bb-4890-b0fc-11271ef50754"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.273242 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/269b9513-d2bb-4890-b0fc-11271ef50754-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.372202 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.372210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"018a5f60-5274-4779-912f-4d7c32b6bfe5","Type":"ContainerDied","Data":"c06a89c1ad7d6139f4bd82b52d85f44eef2a1c0fce1496adc31fd75fb82d6a0a"} Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.372368 4792 scope.go:117] "RemoveContainer" containerID="877641f65256f870166235530babf01a0e7be7e226ff72cb2f08b823a95eaeeb" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.374463 4792 generic.go:334] "Generic (PLEG): container finished" podID="269b9513-d2bb-4890-b0fc-11271ef50754" containerID="067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3" exitCode=2 Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.374517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"269b9513-d2bb-4890-b0fc-11271ef50754","Type":"ContainerDied","Data":"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3"} Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.374552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"269b9513-d2bb-4890-b0fc-11271ef50754","Type":"ContainerDied","Data":"064f4a99970a118a2284b468f6d1a27ceb8ad67362ab9902aa98ba5136664964"} Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.374611 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.444661 4792 scope.go:117] "RemoveContainer" containerID="067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.446100 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.500274 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.518189 4792 scope.go:117] "RemoveContainer" containerID="067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3" Mar 18 16:02:11 crc kubenswrapper[4792]: E0318 16:02:11.518635 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3\": container with ID starting with 067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3 not found: ID does not exist" containerID="067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.518691 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3"} err="failed to get container status \"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3\": rpc error: code = NotFound desc = could not find container \"067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3\": container with ID starting with 067dd0463e39d50c275383f2076b28e70948f15777f2ab2d8a74463e398e6eb3 not found: ID does not exist" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.533230 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: E0318 
16:02:11.533897 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018a5f60-5274-4779-912f-4d7c32b6bfe5" containerName="kube-state-metrics" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.533920 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="018a5f60-5274-4779-912f-4d7c32b6bfe5" containerName="kube-state-metrics" Mar 18 16:02:11 crc kubenswrapper[4792]: E0318 16:02:11.533956 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269b9513-d2bb-4890-b0fc-11271ef50754" containerName="mysqld-exporter" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.533994 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="269b9513-d2bb-4890-b0fc-11271ef50754" containerName="mysqld-exporter" Mar 18 16:02:11 crc kubenswrapper[4792]: E0318 16:02:11.534037 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5954f4a-541b-4146-89bd-eda39e5a9664" containerName="oc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.534046 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5954f4a-541b-4146-89bd-eda39e5a9664" containerName="oc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.534300 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="269b9513-d2bb-4890-b0fc-11271ef50754" containerName="mysqld-exporter" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.534330 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="018a5f60-5274-4779-912f-4d7c32b6bfe5" containerName="kube-state-metrics" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.534347 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5954f4a-541b-4146-89bd-eda39e5a9664" containerName="oc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.536007 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.543463 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.546509 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.558550 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.574039 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.582517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.582576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.582618 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 
16:02:11.582756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7h6\" (UniqueName: \"kubernetes.io/projected/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-api-access-hk7h6\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.590289 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.604951 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.607014 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.609164 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.609351 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.616430 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.685773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.686115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.686257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.686372 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-config-data\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.686886 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.687215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7h6\" (UniqueName: \"kubernetes.io/projected/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-api-access-hk7h6\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.687432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmx8\" (UniqueName: 
\"kubernetes.io/projected/513befbc-4cbb-472a-9770-376700a8d1bb-kube-api-access-ntmx8\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.687728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.691617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.692115 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.692947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b458d30-1f6c-4042-989d-71e39a0aece2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.712056 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7h6\" (UniqueName: 
\"kubernetes.io/projected/2b458d30-1f6c-4042-989d-71e39a0aece2-kube-api-access-hk7h6\") pod \"kube-state-metrics-0\" (UID: \"2b458d30-1f6c-4042-989d-71e39a0aece2\") " pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.789716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.789840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-config-data\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.789932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.790048 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmx8\" (UniqueName: \"kubernetes.io/projected/513befbc-4cbb-472a-9770-376700a8d1bb-kube-api-access-ntmx8\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.792133 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.792612 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-mysqld-exporter-svc" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.794165 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.803576 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-config-data\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.806029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/513befbc-4cbb-472a-9770-376700a8d1bb-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.810641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmx8\" (UniqueName: \"kubernetes.io/projected/513befbc-4cbb-472a-9770-376700a8d1bb-kube-api-access-ntmx8\") pod \"mysqld-exporter-0\" (UID: \"513befbc-4cbb-472a-9770-376700a8d1bb\") " pod="openstack/mysqld-exporter-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.863271 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.881723 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018a5f60-5274-4779-912f-4d7c32b6bfe5" path="/var/lib/kubelet/pods/018a5f60-5274-4779-912f-4d7c32b6bfe5/volumes" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.884446 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269b9513-d2bb-4890-b0fc-11271ef50754" path="/var/lib/kubelet/pods/269b9513-d2bb-4890-b0fc-11271ef50754/volumes" Mar 18 16:02:11 crc kubenswrapper[4792]: I0318 16:02:11.929614 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 18 16:02:12 crc kubenswrapper[4792]: I0318 16:02:12.497369 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 16:02:12 crc kubenswrapper[4792]: I0318 16:02:12.634649 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 18 16:02:12 crc kubenswrapper[4792]: W0318 16:02:12.639929 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod513befbc_4cbb_472a_9770_376700a8d1bb.slice/crio-7f7d23ce882d7f9466d5927cc4bf8e814c7aedfc27c4e9083d42bb54135b7bfa WatchSource:0}: Error finding container 7f7d23ce882d7f9466d5927cc4bf8e814c7aedfc27c4e9083d42bb54135b7bfa: Status 404 returned error can't find the container with id 7f7d23ce882d7f9466d5927cc4bf8e814c7aedfc27c4e9083d42bb54135b7bfa Mar 18 16:02:12 crc kubenswrapper[4792]: I0318 16:02:12.854527 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:02:12 crc kubenswrapper[4792]: E0318 16:02:12.854890 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.306022 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.306841 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-central-agent" containerID="cri-o://1b83e3e4ff459559f8cb1164013e6efa50412db5ffd9b614525693d4e5083300" gracePeriod=30 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.307004 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-notification-agent" containerID="cri-o://6d55978b902f66200b8c4a43c7e7206443af90a367cdf050790e3f9dd1019152" gracePeriod=30 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.306925 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="proxy-httpd" containerID="cri-o://e3c0df205c3858808c62ee66771b1014bf467098c84899489157bbfd42dffbf7" gracePeriod=30 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.306950 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="sg-core" containerID="cri-o://63adba061dac334662a4d1e14c3584be5c0d8d09a223606d7a5d44d229cb80bf" gracePeriod=30 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.459401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"2b458d30-1f6c-4042-989d-71e39a0aece2","Type":"ContainerStarted","Data":"ff65c16f34ed44887ba997fc4c34fca8a03c84e0cea7971303d2022d7e019e70"} Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.459640 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b458d30-1f6c-4042-989d-71e39a0aece2","Type":"ContainerStarted","Data":"33f0e3d06c5bfc44edcc2bf199a78a0cf916dd01710ce6640f196372188e1a45"} Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.459763 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.461102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"513befbc-4cbb-472a-9770-376700a8d1bb","Type":"ContainerStarted","Data":"7f7d23ce882d7f9466d5927cc4bf8e814c7aedfc27c4e9083d42bb54135b7bfa"} Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.465017 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerID="e3c0df205c3858808c62ee66771b1014bf467098c84899489157bbfd42dffbf7" exitCode=0 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.465053 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerID="63adba061dac334662a4d1e14c3584be5c0d8d09a223606d7a5d44d229cb80bf" exitCode=2 Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.465078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerDied","Data":"e3c0df205c3858808c62ee66771b1014bf467098c84899489157bbfd42dffbf7"} Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.465104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerDied","Data":"63adba061dac334662a4d1e14c3584be5c0d8d09a223606d7a5d44d229cb80bf"} Mar 18 16:02:13 crc kubenswrapper[4792]: I0318 16:02:13.491413 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.079188764 podStartE2EDuration="2.491387767s" podCreationTimestamp="2026-03-18 16:02:11 +0000 UTC" firstStartedPulling="2026-03-18 16:02:12.50974888 +0000 UTC m=+1681.379077817" lastFinishedPulling="2026-03-18 16:02:12.921947883 +0000 UTC m=+1681.791276820" observedRunningTime="2026-03-18 16:02:13.484697459 +0000 UTC m=+1682.354026406" watchObservedRunningTime="2026-03-18 16:02:13.491387767 +0000 UTC m=+1682.360716704" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.480600 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"513befbc-4cbb-472a-9770-376700a8d1bb","Type":"ContainerStarted","Data":"029441cd374c13897d56dfefc3987c0eff93323454f6375e29bfb459fbdd27f8"} Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.485616 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerID="6d55978b902f66200b8c4a43c7e7206443af90a367cdf050790e3f9dd1019152" exitCode=0 Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.485683 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerID="1b83e3e4ff459559f8cb1164013e6efa50412db5ffd9b614525693d4e5083300" exitCode=0 Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.485680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerDied","Data":"6d55978b902f66200b8c4a43c7e7206443af90a367cdf050790e3f9dd1019152"} Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.485717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerDied","Data":"1b83e3e4ff459559f8cb1164013e6efa50412db5ffd9b614525693d4e5083300"} Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.525601 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.9671410849999997 podStartE2EDuration="3.525577577s" podCreationTimestamp="2026-03-18 16:02:11 +0000 UTC" firstStartedPulling="2026-03-18 16:02:12.642904147 +0000 UTC m=+1681.512233084" lastFinishedPulling="2026-03-18 16:02:13.201340639 +0000 UTC m=+1682.070669576" observedRunningTime="2026-03-18 16:02:14.510577644 +0000 UTC m=+1683.379906591" watchObservedRunningTime="2026-03-18 16:02:14.525577577 +0000 UTC m=+1683.394906514" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.639621 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.681937 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.682186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.682375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: 
\"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.684822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.684983 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghjrr\" (UniqueName: \"kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.685029 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.685048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.685097 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data\") pod \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\" (UID: \"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a\") " Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.685894 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.690600 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.690629 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.700141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts" (OuterVolumeSpecName: "scripts") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.710282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr" (OuterVolumeSpecName: "kube-api-access-ghjrr") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "kube-api-access-ghjrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.775304 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.794561 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghjrr\" (UniqueName: \"kubernetes.io/projected/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-kube-api-access-ghjrr\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.794601 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.794612 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.833881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.896545 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.931685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data" (OuterVolumeSpecName: "config-data") pod "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" (UID: "b9527d36-6dd8-46ff-9d44-ff87bc79ee0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:14 crc kubenswrapper[4792]: I0318 16:02:14.999122 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.500126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9527d36-6dd8-46ff-9d44-ff87bc79ee0a","Type":"ContainerDied","Data":"81e6b7f34ab051ce05653e995b5f6f441931e5d2f95626202b4084be50fd761e"} Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.500168 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.500509 4792 scope.go:117] "RemoveContainer" containerID="e3c0df205c3858808c62ee66771b1014bf467098c84899489157bbfd42dffbf7" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.543383 4792 scope.go:117] "RemoveContainer" containerID="63adba061dac334662a4d1e14c3584be5c0d8d09a223606d7a5d44d229cb80bf" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.575040 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.629235 4792 scope.go:117] "RemoveContainer" containerID="6d55978b902f66200b8c4a43c7e7206443af90a367cdf050790e3f9dd1019152" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.633015 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.672795 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:15 crc kubenswrapper[4792]: E0318 16:02:15.673368 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-central-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673387 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-central-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: E0318 16:02:15.673408 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="sg-core" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673415 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="sg-core" Mar 18 16:02:15 crc kubenswrapper[4792]: E0318 16:02:15.673433 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="proxy-httpd" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673438 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="proxy-httpd" Mar 18 16:02:15 crc kubenswrapper[4792]: E0318 16:02:15.673484 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-notification-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673491 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-notification-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673700 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="proxy-httpd" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673718 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="sg-core" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673734 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-central-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.673753 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" containerName="ceilometer-notification-agent" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.675884 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.683328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.694658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.694888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.697919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715458 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715555 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gjtm8\" (UniqueName: \"kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715643 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.715742 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.757911 4792 scope.go:117] "RemoveContainer" containerID="1b83e3e4ff459559f8cb1164013e6efa50412db5ffd9b614525693d4e5083300" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817558 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817637 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817782 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817826 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtm8\" (UniqueName: \"kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8\") pod \"ceilometer-0\" (UID: 
\"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.817984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.818054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.822109 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.822229 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.825521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.826016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.827301 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.827514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.827937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.842312 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtm8\" (UniqueName: \"kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8\") pod \"ceilometer-0\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " pod="openstack/ceilometer-0" Mar 18 16:02:15 crc kubenswrapper[4792]: I0318 16:02:15.867831 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9527d36-6dd8-46ff-9d44-ff87bc79ee0a" path="/var/lib/kubelet/pods/b9527d36-6dd8-46ff-9d44-ff87bc79ee0a/volumes" Mar 18 16:02:16 crc kubenswrapper[4792]: I0318 16:02:16.036716 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:16 crc kubenswrapper[4792]: W0318 16:02:16.527676 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f38eac4_0492_457a_b865_410ff4595c18.slice/crio-147e59f0a4785961aea124816858f52504e8b55cb539c3780da16da204bdb176 WatchSource:0}: Error finding container 147e59f0a4785961aea124816858f52504e8b55cb539c3780da16da204bdb176: Status 404 returned error can't find the container with id 147e59f0a4785961aea124816858f52504e8b55cb539c3780da16da204bdb176 Mar 18 16:02:16 crc kubenswrapper[4792]: I0318 16:02:16.536122 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:17 crc kubenswrapper[4792]: I0318 16:02:17.529675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerStarted","Data":"147e59f0a4785961aea124816858f52504e8b55cb539c3780da16da204bdb176"} Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.463528 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-pfgr6"] Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.474354 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-pfgr6"] Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.544361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerStarted","Data":"cebd13e2e41f51231399bf694203239777312cd5472c9fc251408fa64688c206"} Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.544403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerStarted","Data":"fbf9e0df3b873a84675cc6a47e309bb369a05930a8c6306060c4dc60872e9dc1"} Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 
16:02:18.584535 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-x8s6w"] Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.586914 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.597851 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-x8s6w"] Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.690266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2pmf\" (UniqueName: \"kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.690672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.690841 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.793195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2pmf\" (UniqueName: \"kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 
16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.793678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.793814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.798480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.801140 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.822860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2pmf\" (UniqueName: \"kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf\") pod \"heat-db-sync-x8s6w\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:18 crc kubenswrapper[4792]: I0318 16:02:18.913303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-x8s6w" Mar 18 16:02:19 crc kubenswrapper[4792]: I0318 16:02:19.680107 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-x8s6w"] Mar 18 16:02:19 crc kubenswrapper[4792]: I0318 16:02:19.888535 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379ff25e-6c0a-45d4-a478-87a5e136aa47" path="/var/lib/kubelet/pods/379ff25e-6c0a-45d4-a478-87a5e136aa47/volumes" Mar 18 16:02:20 crc kubenswrapper[4792]: I0318 16:02:20.583731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x8s6w" event={"ID":"395511e9-6a0e-4101-8e72-87a46bf1218f","Type":"ContainerStarted","Data":"a15054cfc9dad8686249e9d3a762a83d265245d1a0438968918715afdac78ff5"} Mar 18 16:02:20 crc kubenswrapper[4792]: I0318 16:02:20.591255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerStarted","Data":"9b5e1a1115782ecf385e0d02375dfc7bb2e18a0efcbfd52ae2f30b38999a1a54"} Mar 18 16:02:21 crc kubenswrapper[4792]: I0318 16:02:21.588822 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:22 crc kubenswrapper[4792]: I0318 16:02:22.100678 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 16:02:22 crc kubenswrapper[4792]: I0318 16:02:22.640562 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:23 crc kubenswrapper[4792]: I0318 16:02:23.673265 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerStarted","Data":"620ede18df5dd7316b5c5e2686f0e33992c731a951b613cc8e7478d3f8a2121e"} Mar 18 16:02:23 crc kubenswrapper[4792]: I0318 16:02:23.675227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 18 16:02:23 crc kubenswrapper[4792]: I0318 16:02:23.734118 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.794631347 podStartE2EDuration="8.734098937s" podCreationTimestamp="2026-03-18 16:02:15 +0000 UTC" firstStartedPulling="2026-03-18 16:02:16.532166759 +0000 UTC m=+1685.401495696" lastFinishedPulling="2026-03-18 16:02:22.471634349 +0000 UTC m=+1691.340963286" observedRunningTime="2026-03-18 16:02:23.73063058 +0000 UTC m=+1692.599959547" watchObservedRunningTime="2026-03-18 16:02:23.734098937 +0000 UTC m=+1692.603427874" Mar 18 16:02:25 crc kubenswrapper[4792]: I0318 16:02:25.397298 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:26 crc kubenswrapper[4792]: I0318 16:02:26.725359 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-central-agent" containerID="cri-o://fbf9e0df3b873a84675cc6a47e309bb369a05930a8c6306060c4dc60872e9dc1" gracePeriod=30 Mar 18 16:02:26 crc kubenswrapper[4792]: I0318 16:02:26.725431 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="sg-core" containerID="cri-o://9b5e1a1115782ecf385e0d02375dfc7bb2e18a0efcbfd52ae2f30b38999a1a54" gracePeriod=30 Mar 18 16:02:26 crc kubenswrapper[4792]: I0318 16:02:26.725464 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="proxy-httpd" containerID="cri-o://620ede18df5dd7316b5c5e2686f0e33992c731a951b613cc8e7478d3f8a2121e" gracePeriod=30 Mar 18 16:02:26 crc kubenswrapper[4792]: I0318 16:02:26.725477 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-notification-agent" containerID="cri-o://cebd13e2e41f51231399bf694203239777312cd5472c9fc251408fa64688c206" gracePeriod=30 Mar 18 16:02:26 crc kubenswrapper[4792]: I0318 16:02:26.854654 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:02:26 crc kubenswrapper[4792]: E0318 16:02:26.854988 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.741847 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f38eac4-0492-457a-b865-410ff4595c18" containerID="620ede18df5dd7316b5c5e2686f0e33992c731a951b613cc8e7478d3f8a2121e" exitCode=0 Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.742210 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f38eac4-0492-457a-b865-410ff4595c18" containerID="9b5e1a1115782ecf385e0d02375dfc7bb2e18a0efcbfd52ae2f30b38999a1a54" exitCode=2 Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.742223 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f38eac4-0492-457a-b865-410ff4595c18" containerID="cebd13e2e41f51231399bf694203239777312cd5472c9fc251408fa64688c206" exitCode=0 Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.741937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerDied","Data":"620ede18df5dd7316b5c5e2686f0e33992c731a951b613cc8e7478d3f8a2121e"} Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.742269 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerDied","Data":"9b5e1a1115782ecf385e0d02375dfc7bb2e18a0efcbfd52ae2f30b38999a1a54"} Mar 18 16:02:27 crc kubenswrapper[4792]: I0318 16:02:27.742285 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerDied","Data":"cebd13e2e41f51231399bf694203239777312cd5472c9fc251408fa64688c206"} Mar 18 16:02:28 crc kubenswrapper[4792]: I0318 16:02:28.238199 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="rabbitmq" containerID="cri-o://0f0e8ea90dacfdde716a4095e4ec74aed9c59ca9aa529d0bf412ff84ee2c9f76" gracePeriod=604794 Mar 18 16:02:28 crc kubenswrapper[4792]: I0318 16:02:28.763018 4792 generic.go:334] "Generic (PLEG): container finished" podID="9f38eac4-0492-457a-b865-410ff4595c18" containerID="fbf9e0df3b873a84675cc6a47e309bb369a05930a8c6306060c4dc60872e9dc1" exitCode=0 Mar 18 16:02:28 crc kubenswrapper[4792]: I0318 16:02:28.763114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerDied","Data":"fbf9e0df3b873a84675cc6a47e309bb369a05930a8c6306060c4dc60872e9dc1"} Mar 18 16:02:28 crc kubenswrapper[4792]: I0318 16:02:28.998159 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="rabbitmq" containerID="cri-o://a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1" gracePeriod=604794 Mar 18 16:02:31 crc kubenswrapper[4792]: I0318 16:02:31.967405 4792 scope.go:117] "RemoveContainer" containerID="fa9bbc577dbb39ebc10f18bbba3367106c73587c29f277b3b0b30ef4c1139b9d" Mar 18 16:02:32 crc kubenswrapper[4792]: 
I0318 16:02:32.153289 4792 scope.go:117] "RemoveContainer" containerID="958144aac3c9104f251ccc54347eb7d0380a5b5ae9a7fa6ae97b8d0da691a98d" Mar 18 16:02:34 crc kubenswrapper[4792]: I0318 16:02:34.860332 4792 generic.go:334] "Generic (PLEG): container finished" podID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerID="0f0e8ea90dacfdde716a4095e4ec74aed9c59ca9aa529d0bf412ff84ee2c9f76" exitCode=0 Mar 18 16:02:34 crc kubenswrapper[4792]: I0318 16:02:34.860456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerDied","Data":"0f0e8ea90dacfdde716a4095e4ec74aed9c59ca9aa529d0bf412ff84ee2c9f76"} Mar 18 16:02:35 crc kubenswrapper[4792]: E0318 16:02:35.558110 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0987841_aa1a_4130_a8e9_aeab1ba7aade.slice/crio-conmon-a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0987841_aa1a_4130_a8e9_aeab1ba7aade.slice/crio-a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:02:35 crc kubenswrapper[4792]: I0318 16:02:35.876275 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerID="a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1" exitCode=0 Mar 18 16:02:35 crc kubenswrapper[4792]: I0318 16:02:35.876394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerDied","Data":"a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1"} Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.180293 4792 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.356500 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtm8\" (UniqueName: \"kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357110 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357338 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357371 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.357562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data\") pod \"9f38eac4-0492-457a-b865-410ff4595c18\" (UID: \"9f38eac4-0492-457a-b865-410ff4595c18\") " Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.358222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.358518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.358902 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.358921 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f38eac4-0492-457a-b865-410ff4595c18-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.368168 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8" (OuterVolumeSpecName: "kube-api-access-gjtm8") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "kube-api-access-gjtm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.381008 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts" (OuterVolumeSpecName: "scripts") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.455193 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.468113 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.468153 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.468171 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtm8\" (UniqueName: \"kubernetes.io/projected/9f38eac4-0492-457a-b865-410ff4595c18-kube-api-access-gjtm8\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.560657 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.575574 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.671023 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.678142 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.686355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data" (OuterVolumeSpecName: "config-data") pod "9f38eac4-0492-457a-b865-410ff4595c18" (UID: "9f38eac4-0492-457a-b865-410ff4595c18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.780199 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f38eac4-0492-457a-b865-410ff4595c18-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.897121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f38eac4-0492-457a-b865-410ff4595c18","Type":"ContainerDied","Data":"147e59f0a4785961aea124816858f52504e8b55cb539c3780da16da204bdb176"} Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.897194 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.897207 4792 scope.go:117] "RemoveContainer" containerID="620ede18df5dd7316b5c5e2686f0e33992c731a951b613cc8e7478d3f8a2121e" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.950782 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.981266 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.993101 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:36 crc kubenswrapper[4792]: E0318 16:02:36.993840 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="proxy-httpd" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.993862 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="proxy-httpd" Mar 18 16:02:36 crc kubenswrapper[4792]: E0318 16:02:36.993905 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-central-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.993914 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-central-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: E0318 16:02:36.993937 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-notification-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.993946 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-notification-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: E0318 16:02:36.993990 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="sg-core" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.993997 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="sg-core" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.994284 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="sg-core" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.994313 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-central-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.994323 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="proxy-httpd" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.994351 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f38eac4-0492-457a-b865-410ff4595c18" containerName="ceilometer-notification-agent" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.997286 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:36 crc kubenswrapper[4792]: I0318 16:02:36.999832 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:36.999989 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.004946 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.008132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087550 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ngnh\" (UniqueName: \"kubernetes.io/projected/ec93232a-54d0-42a4-a659-ed6fc86913c6-kube-api-access-4ngnh\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-config-data\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087749 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-scripts\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.087981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.088010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.088030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.190751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.190806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.190832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.191623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.191805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ngnh\" (UniqueName: \"kubernetes.io/projected/ec93232a-54d0-42a4-a659-ed6fc86913c6-kube-api-access-4ngnh\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.191850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-config-data\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.191954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.192026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.192161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-scripts\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.192385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec93232a-54d0-42a4-a659-ed6fc86913c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.195424 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.196790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 
16:02:37.197114 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-scripts\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.197532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.211592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec93232a-54d0-42a4-a659-ed6fc86913c6-config-data\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.220853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ngnh\" (UniqueName: \"kubernetes.io/projected/ec93232a-54d0-42a4-a659-ed6fc86913c6-kube-api-access-4ngnh\") pod \"ceilometer-0\" (UID: \"ec93232a-54d0-42a4-a659-ed6fc86913c6\") " pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.343886 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 16:02:37 crc kubenswrapper[4792]: I0318 16:02:37.871227 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f38eac4-0492-457a-b865-410ff4595c18" path="/var/lib/kubelet/pods/9f38eac4-0492-457a-b865-410ff4595c18/volumes" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.437057 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: i/o timeout" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.545238 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.548206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.551366 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.581683 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717428 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717499 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6kc7\" (UniqueName: \"kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717796 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.717821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820162 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.820656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6kc7\" (UniqueName: \"kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.822770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.822981 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.823007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: 
\"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.823226 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.823356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.824046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.849512 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6kc7\" (UniqueName: \"kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7\") pod \"dnsmasq-dns-5b75489c6f-r4krh\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.863766 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:02:41 crc kubenswrapper[4792]: E0318 16:02:41.864084 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.894883 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:41 crc kubenswrapper[4792]: I0318 16:02:41.918033 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: i/o timeout" Mar 18 16:02:42 crc kubenswrapper[4792]: E0318 16:02:42.050354 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 18 16:02:42 crc kubenswrapper[4792]: E0318 16:02:42.050420 4792 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 18 16:02:42 crc kubenswrapper[4792]: E0318 16:02:42.050888 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2pmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-x8s6w_openstack(395511e9-6a0e-4101-8e72-87a46bf1218f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
18 16:02:42 crc kubenswrapper[4792]: E0318 16:02:42.052186 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-x8s6w" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.156660 4792 scope.go:117] "RemoveContainer" containerID="9b5e1a1115782ecf385e0d02375dfc7bb2e18a0efcbfd52ae2f30b38999a1a54" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.417080 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.428222 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.437441 4792 scope.go:117] "RemoveContainer" containerID="cebd13e2e41f51231399bf694203239777312cd5472c9fc251408fa64688c206" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.528307 4792 scope.go:117] "RemoveContainer" containerID="fbf9e0df3b873a84675cc6a47e309bb369a05930a8c6306060c4dc60872e9dc1" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.555453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.555556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8lkm\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc 
kubenswrapper[4792]: I0318 16:02:42.555600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.555641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.555692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.556446 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.556480 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.556522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.556608 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.556801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.559868 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.566718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.567643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.567731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.567827 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.567879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.569268 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.569677 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzq9b\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.573130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.578494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm" (OuterVolumeSpecName: "kube-api-access-h8lkm") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "kube-api-access-h8lkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.583600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.584022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.587771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.589135 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.590919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info\") pod \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\" (UID: \"e0987841-aa1a-4130-a8e9-aeab1ba7aade\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.591514 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.591569 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.591806 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.591874 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd\") pod \"be818cb3-6cf1-4945-a96e-25c124ed1098\" (UID: \"be818cb3-6cf1-4945-a96e-25c124ed1098\") " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597141 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597190 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be818cb3-6cf1-4945-a96e-25c124ed1098-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597202 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597214 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597228 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597239 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8lkm\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-kube-api-access-h8lkm\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.597250 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.598266 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie" (OuterVolumeSpecName: 
"rabbitmq-erlang-cookie") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.599582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.608095 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info" (OuterVolumeSpecName: "pod-info") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.608134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b" (OuterVolumeSpecName: "kube-api-access-vzq9b") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "kube-api-access-vzq9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.608216 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.627827 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.648136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info" (OuterVolumeSpecName: "pod-info") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.682016 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4" (OuterVolumeSpecName: "persistence") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.682250 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879" (OuterVolumeSpecName: "persistence") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "pvc-a993b93a-d248-4189-85a9-9a321b578879". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.691446 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data" (OuterVolumeSpecName: "config-data") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.698085 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data" (OuterVolumeSpecName: "config-data") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703781 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703822 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0987841-aa1a-4130-a8e9-aeab1ba7aade-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703869 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") on node \"crc\" " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703886 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703900 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703921 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") on node \"crc\" " Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703934 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703947 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzq9b\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-kube-api-access-vzq9b\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703959 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0987841-aa1a-4130-a8e9-aeab1ba7aade-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.703987 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be818cb3-6cf1-4945-a96e-25c124ed1098-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.704002 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.765637 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf" (OuterVolumeSpecName: "server-conf") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.767720 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.767929 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4") on node "crc" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.782041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf" (OuterVolumeSpecName: "server-conf") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.810743 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0987841-aa1a-4130-a8e9-aeab1ba7aade-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.810780 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be818cb3-6cf1-4945-a96e-25c124ed1098-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.810795 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.830016 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.830176 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a993b93a-d248-4189-85a9-9a321b578879" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879") on node "crc" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.866350 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e0987841-aa1a-4130-a8e9-aeab1ba7aade" (UID: "e0987841-aa1a-4130-a8e9-aeab1ba7aade"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.876135 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "be818cb3-6cf1-4945-a96e-25c124ed1098" (UID: "be818cb3-6cf1-4945-a96e-25c124ed1098"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.912639 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be818cb3-6cf1-4945-a96e-25c124ed1098-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.912909 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0987841-aa1a-4130-a8e9-aeab1ba7aade-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:42 crc kubenswrapper[4792]: I0318 16:02:42.913443 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.005389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"be818cb3-6cf1-4945-a96e-25c124ed1098","Type":"ContainerDied","Data":"64bf54b85b9e808b291fec80bc6b702efa5b395b9ee767c9f9bc8ee86a106adf"} Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.005462 4792 scope.go:117] "RemoveContainer" containerID="0f0e8ea90dacfdde716a4095e4ec74aed9c59ca9aa529d0bf412ff84ee2c9f76" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.005416 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.025133 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0987841-aa1a-4130-a8e9-aeab1ba7aade","Type":"ContainerDied","Data":"68a73a64d4dffe2611d6dc7d272e9515fc3d79e622f71e037de3b3d13ea8ffeb"} Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.025207 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: E0318 16:02:43.042922 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-x8s6w" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.075831 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.112136 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.146478 4792 scope.go:117] "RemoveContainer" containerID="8ca1e9174e1b68cb3a500c36721db12755c75144ada52226620997217d432a3c" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.189888 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:43 crc kubenswrapper[4792]: E0318 16:02:43.190931 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="setup-container" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.190958 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="setup-container" Mar 18 16:02:43 crc 
kubenswrapper[4792]: E0318 16:02:43.191005 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.191018 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: E0318 16:02:43.191040 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.191071 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: E0318 16:02:43.191103 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="setup-container" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.191112 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="setup-container" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.191573 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.191610 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" containerName="rabbitmq" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.193949 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.224951 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7r5\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-kube-api-access-4x7r5\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-config-data\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225065 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225137 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: 
\"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-server-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225283 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " 
pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.225325 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-pod-info\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.234357 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.272120 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.287497 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.299178 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.310460 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.321522 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.325696 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327471 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-server-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-pod-info\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7r5\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-kube-api-access-4x7r5\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327692 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-config-data\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327783 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.327950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.328015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.328860 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.329036 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.329079 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-server-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.329720 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.330772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.333067 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.333305 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 16:02:43 crc 
kubenswrapper[4792]: I0318 16:02:43.333565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.334074 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.334434 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.334564 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.336190 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.336242 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/87ee01b6bedb30e6fd03d8511a9d9616a8c5f390321a6aaea3aadb41dde33bb0/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.336322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-brgwk" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.337261 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-pod-info\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.337629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-config-data\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.338407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.346669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.357436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7r5\" (UniqueName: \"kubernetes.io/projected/3a2e23b3-06c8-41e9-94d3-fa6fe815e906-kube-api-access-4x7r5\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.359596 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.385475 4792 scope.go:117] "RemoveContainer" containerID="a73fb5d896401b251fb4a68c23fce2bf038ce7ca13f630624079db0959990df1" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.429003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a993b93a-d248-4189-85a9-9a321b578879\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a993b93a-d248-4189-85a9-9a321b578879\") pod \"rabbitmq-server-2\" (UID: \"3a2e23b3-06c8-41e9-94d3-fa6fe815e906\") " pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.443553 4792 scope.go:117] "RemoveContainer" containerID="f74bb4cde5f8d03c3be1828e98466fefc661c6bd28ec4a480e92e59898e2c8fa" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.532925 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533023 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24b920d2-ca55-4f2d-a313-06cbe39c81b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533052 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24b920d2-ca55-4f2d-a313-06cbe39c81b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 
16:02:43.533211 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533325 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lz92\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-kube-api-access-5lz92\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.533515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 
16:02:43.635445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635657 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24b920d2-ca55-4f2d-a313-06cbe39c81b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635723 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/24b920d2-ca55-4f2d-a313-06cbe39c81b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.635909 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.636058 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lz92\" (UniqueName: 
\"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-kube-api-access-5lz92\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.637358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.639111 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.639872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.640086 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.640131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24b920d2-ca55-4f2d-a313-06cbe39c81b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.640905 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.642305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24b920d2-ca55-4f2d-a313-06cbe39c81b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.642407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24b920d2-ca55-4f2d-a313-06cbe39c81b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.642570 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.642627 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6aa12682446948c208a7df2de2f3e0d6fe0df3f4db75202487c5c3b9a696ecf4/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.645632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.656055 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lz92\" (UniqueName: \"kubernetes.io/projected/24b920d2-ca55-4f2d-a313-06cbe39c81b8-kube-api-access-5lz92\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.674548 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.716323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad52f5f6-0bcc-4118-bb53-b5503e9beea4\") pod \"rabbitmq-cell1-server-0\" (UID: \"24b920d2-ca55-4f2d-a313-06cbe39c81b8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.975032 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be818cb3-6cf1-4945-a96e-25c124ed1098" path="/var/lib/kubelet/pods/be818cb3-6cf1-4945-a96e-25c124ed1098/volumes" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.977842 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0987841-aa1a-4130-a8e9-aeab1ba7aade" path="/var/lib/kubelet/pods/e0987841-aa1a-4130-a8e9-aeab1ba7aade/volumes" Mar 18 16:02:43 crc kubenswrapper[4792]: I0318 16:02:43.992349 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 16:02:44.097206 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"99aa4a18f841b9ac50f3f88d10c6a466d547a10913fce72061f802ecc2b3b8db"} Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 16:02:44.112320 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6136483-c173-41ff-98e3-6c0472ee464f" containerID="b876576145bb9ed1da5a5558aa758f8c55f817ca489ffd15940130cbf4a3a229" exitCode=0 Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 16:02:44.112484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" event={"ID":"e6136483-c173-41ff-98e3-6c0472ee464f","Type":"ContainerDied","Data":"b876576145bb9ed1da5a5558aa758f8c55f817ca489ffd15940130cbf4a3a229"} Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 16:02:44.112516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" event={"ID":"e6136483-c173-41ff-98e3-6c0472ee464f","Type":"ContainerStarted","Data":"6902fb4641c2809fedf406136db28c9e5fbbabb9ef58126cfb11bb6d48e6144f"} Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 16:02:44.432888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 18 16:02:44 crc kubenswrapper[4792]: W0318 16:02:44.688930 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b920d2_ca55_4f2d_a313_06cbe39c81b8.slice/crio-07dfaf1e7f5bbd5c643e74630af4b7e276cf37b48ce5c2e5620850f01523817b WatchSource:0}: Error finding container 07dfaf1e7f5bbd5c643e74630af4b7e276cf37b48ce5c2e5620850f01523817b: Status 404 returned error can't find the container with id 07dfaf1e7f5bbd5c643e74630af4b7e276cf37b48ce5c2e5620850f01523817b Mar 18 16:02:44 crc kubenswrapper[4792]: I0318 
16:02:44.692094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 16:02:45 crc kubenswrapper[4792]: I0318 16:02:45.159312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24b920d2-ca55-4f2d-a313-06cbe39c81b8","Type":"ContainerStarted","Data":"07dfaf1e7f5bbd5c643e74630af4b7e276cf37b48ce5c2e5620850f01523817b"} Mar 18 16:02:45 crc kubenswrapper[4792]: I0318 16:02:45.161984 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"3a2e23b3-06c8-41e9-94d3-fa6fe815e906","Type":"ContainerStarted","Data":"4d0bcb44064762e1543c6980f40e7aa8218c091766e7e97605e66869cddda670"} Mar 18 16:02:45 crc kubenswrapper[4792]: I0318 16:02:45.165628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" event={"ID":"e6136483-c173-41ff-98e3-6c0472ee464f","Type":"ContainerStarted","Data":"4ebea08edaca8a4a6d08983b6f3f825649307662de043611af82327a1934832e"} Mar 18 16:02:45 crc kubenswrapper[4792]: I0318 16:02:45.165803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:45 crc kubenswrapper[4792]: I0318 16:02:45.199316 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" podStartSLOduration=4.199298767 podStartE2EDuration="4.199298767s" podCreationTimestamp="2026-03-18 16:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:45.196063196 +0000 UTC m=+1714.065392153" watchObservedRunningTime="2026-03-18 16:02:45.199298767 +0000 UTC m=+1714.068627704" Mar 18 16:02:47 crc kubenswrapper[4792]: I0318 16:02:47.192929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" 
event={"ID":"3a2e23b3-06c8-41e9-94d3-fa6fe815e906","Type":"ContainerStarted","Data":"350c8a7785782e97d0cca720cef383c6bdacd3971a89e9eb38fc9e1b22607900"} Mar 18 16:02:47 crc kubenswrapper[4792]: I0318 16:02:47.195677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24b920d2-ca55-4f2d-a313-06cbe39c81b8","Type":"ContainerStarted","Data":"8d96efcb967ef891f55ef50a32a4c69e0f368b2c210ec6d95bb2eca793bd0e7f"} Mar 18 16:02:49 crc kubenswrapper[4792]: I0318 16:02:49.221751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"b9ca72fee08cecac34ea630ed28653def4e27ceec992fc19ffc782d6dc32d4a3"} Mar 18 16:02:50 crc kubenswrapper[4792]: I0318 16:02:50.233849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"ca6c31d284f7f40e2970a5276d63db14e683bd2e344c811d5fe5423d492ae4f5"} Mar 18 16:02:51 crc kubenswrapper[4792]: I0318 16:02:51.897011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:02:51 crc kubenswrapper[4792]: I0318 16:02:51.983644 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:02:51 crc kubenswrapper[4792]: I0318 16:02:51.983875 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="dnsmasq-dns" containerID="cri-o://89c262d9a910bfee5d80883e1dba821067ea03428179bed79bb67704babc339a" gracePeriod=10 Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.231846 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-xp9vl"] Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.237884 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-config\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.279785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkgd\" (UniqueName: \"kubernetes.io/projected/4e9dff47-8cda-4561-8f7e-d381ad180ea6-kube-api-access-fbkgd\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.293265 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-xp9vl"] Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.305052 4792 generic.go:334] "Generic (PLEG): container finished" podID="40fc9569-2619-4401-905f-7f39df040ecb" containerID="89c262d9a910bfee5d80883e1dba821067ea03428179bed79bb67704babc339a" exitCode=0 Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.305121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" event={"ID":"40fc9569-2619-4401-905f-7f39df040ecb","Type":"ContainerDied","Data":"89c262d9a910bfee5d80883e1dba821067ea03428179bed79bb67704babc339a"} Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.381708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.381753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-config\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.381830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.381859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.381979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkgd\" (UniqueName: \"kubernetes.io/projected/4e9dff47-8cda-4561-8f7e-d381ad180ea6-kube-api-access-fbkgd\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.382076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.382154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.382689 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.382844 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.383391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.383747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.384105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.384244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9dff47-8cda-4561-8f7e-d381ad180ea6-config\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.412650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbkgd\" (UniqueName: \"kubernetes.io/projected/4e9dff47-8cda-4561-8f7e-d381ad180ea6-kube-api-access-fbkgd\") pod \"dnsmasq-dns-5d75f767dc-xp9vl\" (UID: \"4e9dff47-8cda-4561-8f7e-d381ad180ea6\") " pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.578094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.792629 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.808893 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.808957 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.809020 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.809174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsptf\" (UniqueName: \"kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.809327 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.809507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc\") pod \"40fc9569-2619-4401-905f-7f39df040ecb\" (UID: \"40fc9569-2619-4401-905f-7f39df040ecb\") " Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.827291 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf" (OuterVolumeSpecName: "kube-api-access-fsptf") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "kube-api-access-fsptf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.913234 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsptf\" (UniqueName: \"kubernetes.io/projected/40fc9569-2619-4401-905f-7f39df040ecb-kube-api-access-fsptf\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.915108 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.941048 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config" (OuterVolumeSpecName: "config") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.942538 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.943902 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:52 crc kubenswrapper[4792]: I0318 16:02:52.972582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "40fc9569-2619-4401-905f-7f39df040ecb" (UID: "40fc9569-2619-4401-905f-7f39df040ecb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.017095 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.017138 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.017152 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.017163 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.017175 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40fc9569-2619-4401-905f-7f39df040ecb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.199077 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-xp9vl"] Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.331011 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" event={"ID":"40fc9569-2619-4401-905f-7f39df040ecb","Type":"ContainerDied","Data":"17bc40311c9c163fac4f46df28616bd35ef49185d75b4df718a892850a559727"} Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.331103 4792 scope.go:117] "RemoveContainer" 
containerID="89c262d9a910bfee5d80883e1dba821067ea03428179bed79bb67704babc339a" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.331406 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-vs6l6" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.337011 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" event={"ID":"4e9dff47-8cda-4561-8f7e-d381ad180ea6","Type":"ContainerStarted","Data":"366a775e2ad265f799cd85e6c8e62e4bfcafb05e64710fe317883b0dfcaf9434"} Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.455245 4792 scope.go:117] "RemoveContainer" containerID="8ca544437284785d8d1298a4681da252c571b07a9d2044d3b570597a54ca847e" Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.487926 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.504860 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-vs6l6"] Mar 18 16:02:53 crc kubenswrapper[4792]: I0318 16:02:53.868554 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fc9569-2619-4401-905f-7f39df040ecb" path="/var/lib/kubelet/pods/40fc9569-2619-4401-905f-7f39df040ecb/volumes" Mar 18 16:02:54 crc kubenswrapper[4792]: I0318 16:02:54.359779 4792 generic.go:334] "Generic (PLEG): container finished" podID="4e9dff47-8cda-4561-8f7e-d381ad180ea6" containerID="753bde6da995cc778018bef6f1ede681251c260d428c3fcc73496cd787d405cb" exitCode=0 Mar 18 16:02:54 crc kubenswrapper[4792]: I0318 16:02:54.360453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" event={"ID":"4e9dff47-8cda-4561-8f7e-d381ad180ea6","Type":"ContainerDied","Data":"753bde6da995cc778018bef6f1ede681251c260d428c3fcc73496cd787d405cb"} Mar 18 16:02:54 crc kubenswrapper[4792]: I0318 16:02:54.854841 4792 scope.go:117] 
"RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:02:54 crc kubenswrapper[4792]: E0318 16:02:54.855344 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:02:55 crc kubenswrapper[4792]: I0318 16:02:55.411340 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" event={"ID":"4e9dff47-8cda-4561-8f7e-d381ad180ea6","Type":"ContainerStarted","Data":"5f544afa6d814bcf4510d1882e8aa91ea3b16ea39b2ca0cb29cb7199bf127c05"} Mar 18 16:02:55 crc kubenswrapper[4792]: I0318 16:02:55.411915 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:02:55 crc kubenswrapper[4792]: I0318 16:02:55.414688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"7c5de93c1fc79b3bd6da5c1f155afb766d9dedaa9a5e1516ba618a7360d0edfd"} Mar 18 16:02:55 crc kubenswrapper[4792]: I0318 16:02:55.435198 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" podStartSLOduration=3.435176515 podStartE2EDuration="3.435176515s" podCreationTimestamp="2026-03-18 16:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:02:55.431229924 +0000 UTC m=+1724.300558861" watchObservedRunningTime="2026-03-18 16:02:55.435176515 +0000 UTC m=+1724.304505452" Mar 18 16:02:57 crc kubenswrapper[4792]: I0318 
16:02:57.459174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"9dbb3f34dc92b178175ada096f1c3cc33775a328707775b26a7b358d0867a0db"} Mar 18 16:02:57 crc kubenswrapper[4792]: I0318 16:02:57.459573 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 16:02:57 crc kubenswrapper[4792]: I0318 16:02:57.462513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x8s6w" event={"ID":"395511e9-6a0e-4101-8e72-87a46bf1218f","Type":"ContainerStarted","Data":"7ac2b924412360c31fdb7b224d2357e7bbc46b8a97725628fd5c83eb408a5065"} Mar 18 16:02:57 crc kubenswrapper[4792]: I0318 16:02:57.507523 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.907801111 podStartE2EDuration="21.507498149s" podCreationTimestamp="2026-03-18 16:02:36 +0000 UTC" firstStartedPulling="2026-03-18 16:02:43.160236691 +0000 UTC m=+1712.029565628" lastFinishedPulling="2026-03-18 16:02:56.759933729 +0000 UTC m=+1725.629262666" observedRunningTime="2026-03-18 16:02:57.495410616 +0000 UTC m=+1726.364739563" watchObservedRunningTime="2026-03-18 16:02:57.507498149 +0000 UTC m=+1726.376827086" Mar 18 16:02:57 crc kubenswrapper[4792]: I0318 16:02:57.528266 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-x8s6w" podStartSLOduration=2.202625785 podStartE2EDuration="39.52824018s" podCreationTimestamp="2026-03-18 16:02:18 +0000 UTC" firstStartedPulling="2026-03-18 16:02:19.721154891 +0000 UTC m=+1688.590483828" lastFinishedPulling="2026-03-18 16:02:57.046769286 +0000 UTC m=+1725.916098223" observedRunningTime="2026-03-18 16:02:57.514350581 +0000 UTC m=+1726.383679528" watchObservedRunningTime="2026-03-18 16:02:57.52824018 +0000 UTC m=+1726.397569117" Mar 18 16:03:00 crc kubenswrapper[4792]: I0318 
16:03:00.509986 4792 generic.go:334] "Generic (PLEG): container finished" podID="395511e9-6a0e-4101-8e72-87a46bf1218f" containerID="7ac2b924412360c31fdb7b224d2357e7bbc46b8a97725628fd5c83eb408a5065" exitCode=0 Mar 18 16:03:00 crc kubenswrapper[4792]: I0318 16:03:00.510223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x8s6w" event={"ID":"395511e9-6a0e-4101-8e72-87a46bf1218f","Type":"ContainerDied","Data":"7ac2b924412360c31fdb7b224d2357e7bbc46b8a97725628fd5c83eb408a5065"} Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.044138 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x8s6w" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.106681 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data\") pod \"395511e9-6a0e-4101-8e72-87a46bf1218f\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.107081 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2pmf\" (UniqueName: \"kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf\") pod \"395511e9-6a0e-4101-8e72-87a46bf1218f\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.107169 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle\") pod \"395511e9-6a0e-4101-8e72-87a46bf1218f\" (UID: \"395511e9-6a0e-4101-8e72-87a46bf1218f\") " Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.113095 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf" 
(OuterVolumeSpecName: "kube-api-access-f2pmf") pod "395511e9-6a0e-4101-8e72-87a46bf1218f" (UID: "395511e9-6a0e-4101-8e72-87a46bf1218f"). InnerVolumeSpecName "kube-api-access-f2pmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.141451 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "395511e9-6a0e-4101-8e72-87a46bf1218f" (UID: "395511e9-6a0e-4101-8e72-87a46bf1218f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.206278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data" (OuterVolumeSpecName: "config-data") pod "395511e9-6a0e-4101-8e72-87a46bf1218f" (UID: "395511e9-6a0e-4101-8e72-87a46bf1218f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.211020 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.211106 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2pmf\" (UniqueName: \"kubernetes.io/projected/395511e9-6a0e-4101-8e72-87a46bf1218f-kube-api-access-f2pmf\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.211121 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/395511e9-6a0e-4101-8e72-87a46bf1218f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.535859 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x8s6w" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.535872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x8s6w" event={"ID":"395511e9-6a0e-4101-8e72-87a46bf1218f","Type":"ContainerDied","Data":"a15054cfc9dad8686249e9d3a762a83d265245d1a0438968918715afdac78ff5"} Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.536286 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a15054cfc9dad8686249e9d3a762a83d265245d1a0438968918715afdac78ff5" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.581392 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-xp9vl" Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.699836 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:03:02 crc kubenswrapper[4792]: I0318 16:03:02.700252 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="dnsmasq-dns" containerID="cri-o://4ebea08edaca8a4a6d08983b6f3f825649307662de043611af82327a1934832e" gracePeriod=10 Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.556420 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6136483-c173-41ff-98e3-6c0472ee464f" containerID="4ebea08edaca8a4a6d08983b6f3f825649307662de043611af82327a1934832e" exitCode=0 Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.556777 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" event={"ID":"e6136483-c173-41ff-98e3-6c0472ee464f","Type":"ContainerDied","Data":"4ebea08edaca8a4a6d08983b6f3f825649307662de043611af82327a1934832e"} Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.556813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" event={"ID":"e6136483-c173-41ff-98e3-6c0472ee464f","Type":"ContainerDied","Data":"6902fb4641c2809fedf406136db28c9e5fbbabb9ef58126cfb11bb6d48e6144f"} Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.556828 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6902fb4641c2809fedf406136db28c9e5fbbabb9ef58126cfb11bb6d48e6144f" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.568729 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.692740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6kc7\" (UniqueName: \"kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.692852 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.692961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.693083 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.693128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.693245 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.693383 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb\") pod \"e6136483-c173-41ff-98e3-6c0472ee464f\" (UID: \"e6136483-c173-41ff-98e3-6c0472ee464f\") " Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.730110 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7" (OuterVolumeSpecName: "kube-api-access-q6kc7") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "kube-api-access-q6kc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.767960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.771383 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.771767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config" (OuterVolumeSpecName: "config") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.791761 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.797310 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6kc7\" (UniqueName: \"kubernetes.io/projected/e6136483-c173-41ff-98e3-6c0472ee464f-kube-api-access-q6kc7\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.797343 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.797353 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.797383 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-config\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc 
kubenswrapper[4792]: I0318 16:03:03.797391 4792 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.817962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.818070 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6136483-c173-41ff-98e3-6c0472ee464f" (UID: "e6136483-c173-41ff-98e3-6c0472ee464f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.901583 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:03 crc kubenswrapper[4792]: I0318 16:03:03.901647 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6136483-c173-41ff-98e3-6c0472ee464f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.478496 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7d5dfb5b8b-2znh8"] Mar 18 16:03:04 crc kubenswrapper[4792]: E0318 16:03:04.479458 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479484 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: E0318 16:03:04.479508 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="init" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479517 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="init" Mar 18 16:03:04 crc kubenswrapper[4792]: E0318 16:03:04.479536 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" containerName="heat-db-sync" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479543 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" containerName="heat-db-sync" Mar 18 16:03:04 crc kubenswrapper[4792]: E0318 16:03:04.479572 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="init" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479581 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="init" Mar 18 16:03:04 crc kubenswrapper[4792]: E0318 16:03:04.479603 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479610 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479904 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fc9569-2619-4401-905f-7f39df040ecb" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479937 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6136483-c173-41ff-98e3-6c0472ee464f" containerName="dnsmasq-dns" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.479996 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" containerName="heat-db-sync" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.481091 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.490682 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7d5dfb5b8b-2znh8"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.527759 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-58f688bc9b-9h9n4"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.530009 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.572317 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-r4krh" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.579384 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58f688bc9b-9h9n4"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.609727 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-f8cb7866d-pk45f"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.611556 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-combined-ca-bundle\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630292 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-public-tls-certs\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876jd\" (UniqueName: \"kubernetes.io/projected/46138c03-275f-46ea-b4d5-2947fcfe979c-kube-api-access-876jd\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" 
Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data-custom\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7kc\" (UniqueName: \"kubernetes.io/projected/a03aec86-430b-4209-8f15-f7fb97d58276-kube-api-access-rn7kc\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630496 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-combined-ca-bundle\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-internal-tls-certs\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " 
pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630852 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.630887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data-custom\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.639405 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-f8cb7866d-pk45f"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.679823 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.696482 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-r4krh"] Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-combined-ca-bundle\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-public-tls-certs\") pod 
\"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733456 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-876jd\" (UniqueName: \"kubernetes.io/projected/46138c03-275f-46ea-b4d5-2947fcfe979c-kube-api-access-876jd\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-combined-ca-bundle\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data-custom\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733565 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn7kc\" (UniqueName: \"kubernetes.io/projected/a03aec86-430b-4209-8f15-f7fb97d58276-kube-api-access-rn7kc\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data-custom\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-combined-ca-bundle\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.733666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-public-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.734608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-internal-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-internal-tls-certs\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736485 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n6cj\" (UniqueName: \"kubernetes.io/projected/c8032f2a-13e3-4463-bab4-1b1d850e4b06-kube-api-access-2n6cj\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.736576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data-custom\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.739746 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-combined-ca-bundle\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.739921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data-custom\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.741824 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-public-tls-certs\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.743142 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.743637 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46138c03-275f-46ea-b4d5-2947fcfe979c-config-data\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.743803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-internal-tls-certs\") pod 
\"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.749313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-combined-ca-bundle\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.752436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn7kc\" (UniqueName: \"kubernetes.io/projected/a03aec86-430b-4209-8f15-f7fb97d58276-kube-api-access-rn7kc\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.753176 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-876jd\" (UniqueName: \"kubernetes.io/projected/46138c03-275f-46ea-b4d5-2947fcfe979c-kube-api-access-876jd\") pod \"heat-engine-7d5dfb5b8b-2znh8\" (UID: \"46138c03-275f-46ea-b4d5-2947fcfe979c\") " pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.762610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a03aec86-430b-4209-8f15-f7fb97d58276-config-data-custom\") pod \"heat-api-58f688bc9b-9h9n4\" (UID: \"a03aec86-430b-4209-8f15-f7fb97d58276\") " pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.808099 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.844347 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-combined-ca-bundle\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.844942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data-custom\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.845133 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-public-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.845175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.845372 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-internal-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " 
pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.848438 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n6cj\" (UniqueName: \"kubernetes.io/projected/c8032f2a-13e3-4463-bab4-1b1d850e4b06-kube-api-access-2n6cj\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.848851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-combined-ca-bundle\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.850450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data-custom\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.853325 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-internal-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.853774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-public-tls-certs\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc 
kubenswrapper[4792]: I0318 16:03:04.856033 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.873125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n6cj\" (UniqueName: \"kubernetes.io/projected/c8032f2a-13e3-4463-bab4-1b1d850e4b06-kube-api-access-2n6cj\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.880270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8032f2a-13e3-4463-bab4-1b1d850e4b06-config-data\") pod \"heat-cfnapi-f8cb7866d-pk45f\" (UID: \"c8032f2a-13e3-4463-bab4-1b1d850e4b06\") " pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:04 crc kubenswrapper[4792]: I0318 16:03:04.934999 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.478675 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7d5dfb5b8b-2znh8"] Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.533386 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58f688bc9b-9h9n4"] Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.626622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" event={"ID":"46138c03-275f-46ea-b4d5-2947fcfe979c","Type":"ContainerStarted","Data":"a366ab5b19209c3d2a38b5ae9d16ffb5eb8c1e7211fa2ff4f3b5354039b0aae0"} Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.629857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58f688bc9b-9h9n4" event={"ID":"a03aec86-430b-4209-8f15-f7fb97d58276","Type":"ContainerStarted","Data":"91b9c7c4ef09d705e82a807b5b6a051e84961efb94418f3b52c638b908386b40"} Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.702707 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-f8cb7866d-pk45f"] Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.855276 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:03:05 crc kubenswrapper[4792]: E0318 16:03:05.855741 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:03:05 crc kubenswrapper[4792]: I0318 16:03:05.881385 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e6136483-c173-41ff-98e3-6c0472ee464f" path="/var/lib/kubelet/pods/e6136483-c173-41ff-98e3-6c0472ee464f/volumes" Mar 18 16:03:06 crc kubenswrapper[4792]: I0318 16:03:06.656311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" event={"ID":"c8032f2a-13e3-4463-bab4-1b1d850e4b06","Type":"ContainerStarted","Data":"743f9bbe84968f0954cd06f306bddd1497568075af22c5035311fd9b59942653"} Mar 18 16:03:06 crc kubenswrapper[4792]: I0318 16:03:06.664085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" event={"ID":"46138c03-275f-46ea-b4d5-2947fcfe979c","Type":"ContainerStarted","Data":"01c67d295053626a2b1268e5773609a22cbedc47e6d14354411a8007a99b9f37"} Mar 18 16:03:06 crc kubenswrapper[4792]: I0318 16:03:06.664330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:06 crc kubenswrapper[4792]: I0318 16:03:06.692933 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" podStartSLOduration=2.692886963 podStartE2EDuration="2.692886963s" podCreationTimestamp="2026-03-18 16:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:06.688781237 +0000 UTC m=+1735.558110174" watchObservedRunningTime="2026-03-18 16:03:06.692886963 +0000 UTC m=+1735.562215900" Mar 18 16:03:07 crc kubenswrapper[4792]: I0318 16:03:07.380953 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 16:03:10 crc kubenswrapper[4792]: I0318 16:03:10.715418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" event={"ID":"c8032f2a-13e3-4463-bab4-1b1d850e4b06","Type":"ContainerStarted","Data":"1c7caec079d2b35e19cc0f4b99e4465a23a83f14cb2b880596bd45b61980e93d"} Mar 18 16:03:10 crc 
kubenswrapper[4792]: I0318 16:03:10.717865 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:10 crc kubenswrapper[4792]: I0318 16:03:10.720212 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58f688bc9b-9h9n4" event={"ID":"a03aec86-430b-4209-8f15-f7fb97d58276","Type":"ContainerStarted","Data":"1ffdd3b6907ca5d35ac8870287952a189621430cf352cd1a813da9e6437204f1"} Mar 18 16:03:10 crc kubenswrapper[4792]: I0318 16:03:10.721453 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:10 crc kubenswrapper[4792]: I0318 16:03:10.753673 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" podStartSLOduration=3.074830851 podStartE2EDuration="6.753646847s" podCreationTimestamp="2026-03-18 16:03:04 +0000 UTC" firstStartedPulling="2026-03-18 16:03:05.702218758 +0000 UTC m=+1734.571547695" lastFinishedPulling="2026-03-18 16:03:09.381034754 +0000 UTC m=+1738.250363691" observedRunningTime="2026-03-18 16:03:10.743178053 +0000 UTC m=+1739.612507010" watchObservedRunningTime="2026-03-18 16:03:10.753646847 +0000 UTC m=+1739.622975784" Mar 18 16:03:10 crc kubenswrapper[4792]: I0318 16:03:10.782367 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-58f688bc9b-9h9n4" podStartSLOduration=2.935108811 podStartE2EDuration="6.782333164s" podCreationTimestamp="2026-03-18 16:03:04 +0000 UTC" firstStartedPulling="2026-03-18 16:03:05.530991884 +0000 UTC m=+1734.400320821" lastFinishedPulling="2026-03-18 16:03:09.378216237 +0000 UTC m=+1738.247545174" observedRunningTime="2026-03-18 16:03:10.770693154 +0000 UTC m=+1739.640022111" watchObservedRunningTime="2026-03-18 16:03:10.782333164 +0000 UTC m=+1739.651662101" Mar 18 16:03:16 crc kubenswrapper[4792]: I0318 16:03:16.555601 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/heat-api-58f688bc9b-9h9n4" Mar 18 16:03:16 crc kubenswrapper[4792]: I0318 16:03:16.634902 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 16:03:16 crc kubenswrapper[4792]: I0318 16:03:16.635209 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerName="heat-api" containerID="cri-o://59c4dbb186863be5449bf6e56fcd3c162543a849829d5aa01e0ce0c735299742" gracePeriod=60 Mar 18 16:03:16 crc kubenswrapper[4792]: I0318 16:03:16.857751 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:03:16 crc kubenswrapper[4792]: E0318 16:03:16.858098 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.040759 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn"] Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.042775 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.046564 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.047430 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.047558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.047607 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.059388 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn"] Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.123114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.123210 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.123314 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnpb4\" (UniqueName: \"kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.123578 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.225695 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.225818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.225872 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.225951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnpb4\" (UniqueName: \"kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.237280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.238772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.244631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.248822 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnpb4\" (UniqueName: \"kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.297958 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-f8cb7866d-pk45f" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.379114 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.391774 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 16:03:17 crc kubenswrapper[4792]: I0318 16:03:17.392223 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerName="heat-cfnapi" containerID="cri-o://7d8cc42d45a072a54555d56f0c1d99a4dd15f37a57417d38c220aac34a56b077" gracePeriod=60 Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.351851 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn"] Mar 18 16:03:18 crc kubenswrapper[4792]: W0318 16:03:18.351999 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6394e15_1052_4fa6_9a74_bee5cce65ae7.slice/crio-a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e WatchSource:0}: Error finding container 
a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e: Status 404 returned error can't find the container with id a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.881126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" event={"ID":"c6394e15-1052-4fa6-9a74-bee5cce65ae7","Type":"ContainerStarted","Data":"a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e"} Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.882979 4792 generic.go:334] "Generic (PLEG): container finished" podID="3a2e23b3-06c8-41e9-94d3-fa6fe815e906" containerID="350c8a7785782e97d0cca720cef383c6bdacd3971a89e9eb38fc9e1b22607900" exitCode=0 Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.883025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"3a2e23b3-06c8-41e9-94d3-fa6fe815e906","Type":"ContainerDied","Data":"350c8a7785782e97d0cca720cef383c6bdacd3971a89e9eb38fc9e1b22607900"} Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.888504 4792 generic.go:334] "Generic (PLEG): container finished" podID="24b920d2-ca55-4f2d-a313-06cbe39c81b8" containerID="8d96efcb967ef891f55ef50a32a4c69e0f368b2c210ec6d95bb2eca793bd0e7f" exitCode=0 Mar 18 16:03:18 crc kubenswrapper[4792]: I0318 16:03:18.888548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24b920d2-ca55-4f2d-a313-06cbe39c81b8","Type":"ContainerDied","Data":"8d96efcb967ef891f55ef50a32a4c69e0f368b2c210ec6d95bb2eca793bd0e7f"} Mar 18 16:03:19 crc kubenswrapper[4792]: I0318 16:03:19.922477 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"3a2e23b3-06c8-41e9-94d3-fa6fe815e906","Type":"ContainerStarted","Data":"1d36818c1550a4aa79fb0878c121f6a53090a0c7fa33f23d89de85f0c6cc3d68"} Mar 18 16:03:19 crc kubenswrapper[4792]: I0318 
16:03:19.924562 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 18 16:03:19 crc kubenswrapper[4792]: I0318 16:03:19.938572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"24b920d2-ca55-4f2d-a313-06cbe39c81b8","Type":"ContainerStarted","Data":"f5a8fc2bc5764ad2f2081dd3f848942d86530d0edacffb75bf9d0249fa83d926"} Mar 18 16:03:19 crc kubenswrapper[4792]: I0318 16:03:19.939754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.065476 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.065458268 podStartE2EDuration="37.065458268s" podCreationTimestamp="2026-03-18 16:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:20.063525946 +0000 UTC m=+1748.932854873" watchObservedRunningTime="2026-03-18 16:03:20.065458268 +0000 UTC m=+1748.934787205" Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.081683 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=37.081661172 podStartE2EDuration="37.081661172s" podCreationTimestamp="2026-03-18 16:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:03:19.985623143 +0000 UTC m=+1748.854952110" watchObservedRunningTime="2026-03-18 16:03:20.081661172 +0000 UTC m=+1748.950990109" Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.366261 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerName="heat-api" probeResult="failure" output="Get 
\"https://10.217.0.237:8004/healthcheck\": read tcp 10.217.0.2:58726->10.217.0.237:8004: read: connection reset by peer" Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.630721 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.238:8000/healthcheck\": read tcp 10.217.0.2:33720->10.217.0.238:8000: read: connection reset by peer" Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.981677 4792 generic.go:334] "Generic (PLEG): container finished" podID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerID="7d8cc42d45a072a54555d56f0c1d99a4dd15f37a57417d38c220aac34a56b077" exitCode=0 Mar 18 16:03:20 crc kubenswrapper[4792]: I0318 16:03:20.981740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" event={"ID":"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731","Type":"ContainerDied","Data":"7d8cc42d45a072a54555d56f0c1d99a4dd15f37a57417d38c220aac34a56b077"} Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.004082 4792 generic.go:334] "Generic (PLEG): container finished" podID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerID="59c4dbb186863be5449bf6e56fcd3c162543a849829d5aa01e0ce0c735299742" exitCode=0 Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.004368 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" event={"ID":"7cbfe80e-2708-4672-aa17-bb5679fdc195","Type":"ContainerDied","Data":"59c4dbb186863be5449bf6e56fcd3c162543a849829d5aa01e0ce0c735299742"} Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.365355 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.457900 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.458026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.458160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.458181 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96z9f\" (UniqueName: \"kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.458237 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.458375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs\") pod \"7cbfe80e-2708-4672-aa17-bb5679fdc195\" (UID: \"7cbfe80e-2708-4672-aa17-bb5679fdc195\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.483187 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f" (OuterVolumeSpecName: "kube-api-access-96z9f") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "kube-api-access-96z9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.492573 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.560346 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96z9f\" (UniqueName: \"kubernetes.io/projected/7cbfe80e-2708-4672-aa17-bb5679fdc195-kube-api-access-96z9f\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.560383 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.567132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.586755 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data" (OuterVolumeSpecName: "config-data") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.586891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.650108 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cbfe80e-2708-4672-aa17-bb5679fdc195" (UID: "7cbfe80e-2708-4672-aa17-bb5679fdc195"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.663779 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.663830 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.663843 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.663859 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cbfe80e-2708-4672-aa17-bb5679fdc195-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.758715 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.766237 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.766499 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.766595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.766830 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktbx\" (UniqueName: \"kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.767029 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.767105 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data\") pod \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\" (UID: \"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731\") " Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.792177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx" (OuterVolumeSpecName: "kube-api-access-9ktbx") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "kube-api-access-9ktbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.798943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.880788 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.880857 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktbx\" (UniqueName: \"kubernetes.io/projected/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-kube-api-access-9ktbx\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.930210 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.958137 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.972234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.984833 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.984891 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:21 crc kubenswrapper[4792]: I0318 16:03:21.984907 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.019900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" event={"ID":"80c3b2bc-2909-46a9-ac3a-e14d3eb9d731","Type":"ContainerDied","Data":"2e3e3c58b6078cc0280f9bda4afe2961948f26c9bcbe64c3935e0daefc384c4c"} Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.019960 4792 scope.go:117] "RemoveContainer" containerID="7d8cc42d45a072a54555d56f0c1d99a4dd15f37a57417d38c220aac34a56b077" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.020185 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fbd9cff4c-khq7b" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.034277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" event={"ID":"7cbfe80e-2708-4672-aa17-bb5679fdc195","Type":"ContainerDied","Data":"0f1f734df109dff5e2b5c0331a5bcd52e49d5dc4e56a397a68aaceb26db3a1f1"} Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.034406 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7d6bbd8cf5-xbjzv" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.048018 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data" (OuterVolumeSpecName: "config-data") pod "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" (UID: "80c3b2bc-2909-46a9-ac3a-e14d3eb9d731"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.066768 4792 scope.go:117] "RemoveContainer" containerID="59c4dbb186863be5449bf6e56fcd3c162543a849829d5aa01e0ce0c735299742" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.087040 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.088673 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.103016 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7d6bbd8cf5-xbjzv"] Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.364456 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 16:03:22 crc kubenswrapper[4792]: I0318 16:03:22.377864 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6fbd9cff4c-khq7b"] Mar 18 16:03:23 crc kubenswrapper[4792]: I0318 16:03:23.869486 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" path="/var/lib/kubelet/pods/7cbfe80e-2708-4672-aa17-bb5679fdc195/volumes" Mar 18 16:03:23 crc kubenswrapper[4792]: I0318 16:03:23.872561 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" path="/var/lib/kubelet/pods/80c3b2bc-2909-46a9-ac3a-e14d3eb9d731/volumes" Mar 18 16:03:24 crc kubenswrapper[4792]: I0318 16:03:24.852551 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7d5dfb5b8b-2znh8" Mar 18 16:03:25 crc kubenswrapper[4792]: I0318 16:03:25.008626 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 16:03:25 crc kubenswrapper[4792]: I0318 16:03:25.008836 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6c99df75d9-tzn29" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" containerID="cri-o://daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" gracePeriod=60 Mar 18 16:03:28 crc kubenswrapper[4792]: I0318 16:03:28.949414 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-s667r"] Mar 18 16:03:28 crc kubenswrapper[4792]: I0318 16:03:28.968032 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-s667r"] Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.085013 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-vjjkg"] Mar 18 16:03:29 crc kubenswrapper[4792]: E0318 16:03:29.085671 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerName="heat-cfnapi" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.085691 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerName="heat-cfnapi" Mar 18 16:03:29 crc kubenswrapper[4792]: E0318 16:03:29.085731 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerName="heat-api" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.085740 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerName="heat-api" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.086055 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c3b2bc-2909-46a9-ac3a-e14d3eb9d731" containerName="heat-cfnapi" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.086079 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbfe80e-2708-4672-aa17-bb5679fdc195" containerName="heat-api" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.087143 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.089820 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.109330 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-vjjkg"] Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.185007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.185629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.186033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts\") pod 
\"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.186539 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhq9q\" (UniqueName: \"kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.288813 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.288934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.289038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhq9q\" (UniqueName: \"kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.289080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc 
kubenswrapper[4792]: I0318 16:03:29.299305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.299592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.305911 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.311898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhq9q\" (UniqueName: \"kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q\") pod \"aodh-db-sync-vjjkg\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.442868 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:29 crc kubenswrapper[4792]: I0318 16:03:29.872200 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18757a2-4c26-41be-9348-ea8624af2527" path="/var/lib/kubelet/pods/b18757a2-4c26-41be-9348-ea8624af2527/volumes" Mar 18 16:03:30 crc kubenswrapper[4792]: I0318 16:03:30.855542 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:03:30 crc kubenswrapper[4792]: E0318 16:03:30.855918 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:03:33 crc kubenswrapper[4792]: I0318 16:03:33.676231 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="3a2e23b3-06c8-41e9-94d3-fa6fe815e906" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.28:5671: connect: connection refused" Mar 18 16:03:33 crc kubenswrapper[4792]: I0318 16:03:33.999696 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 16:03:34 crc kubenswrapper[4792]: E0318 16:03:34.956139 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:34 crc kubenswrapper[4792]: E0318 16:03:34.959720 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:34 crc kubenswrapper[4792]: E0318 16:03:34.961371 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:34 crc kubenswrapper[4792]: E0318 16:03:34.961454 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6c99df75d9-tzn29" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" Mar 18 16:03:35 crc kubenswrapper[4792]: E0318 16:03:35.143307 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest" Mar 18 16:03:35 crc kubenswrapper[4792]: E0318 16:03:35.143473 4792 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 16:03:35 crc kubenswrapper[4792]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Mar 18 16:03:35 crc kubenswrapper[4792]: - hosts: all Mar 18 16:03:35 crc kubenswrapper[4792]: strategy: linear Mar 18 16:03:35 crc 
kubenswrapper[4792]: tasks: Mar 18 16:03:35 crc kubenswrapper[4792]: - name: Enable podified-repos Mar 18 16:03:35 crc kubenswrapper[4792]: become: true Mar 18 16:03:35 crc kubenswrapper[4792]: ansible.builtin.shell: | Mar 18 16:03:35 crc kubenswrapper[4792]: set -euxo pipefail Mar 18 16:03:35 crc kubenswrapper[4792]: pushd /var/tmp Mar 18 16:03:35 crc kubenswrapper[4792]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Mar 18 16:03:35 crc kubenswrapper[4792]: pushd repo-setup-main Mar 18 16:03:35 crc kubenswrapper[4792]: python3 -m venv ./venv Mar 18 16:03:35 crc kubenswrapper[4792]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Mar 18 16:03:35 crc kubenswrapper[4792]: ./venv/bin/repo-setup current-podified -b antelope Mar 18 16:03:35 crc kubenswrapper[4792]: popd Mar 18 16:03:35 crc kubenswrapper[4792]: rm -rf repo-setup-main Mar 18 16:03:35 crc kubenswrapper[4792]: Mar 18 16:03:35 crc kubenswrapper[4792]: Mar 18 16:03:35 crc kubenswrapper[4792]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Mar 18 16:03:35 crc kubenswrapper[4792]: edpm_override_hosts: openstack-edpm-ipam Mar 18 16:03:35 crc kubenswrapper[4792]: edpm_service_type: repo-setup Mar 18 16:03:35 crc kubenswrapper[4792]: Mar 18 16:03:35 crc kubenswrapper[4792]: Mar 18 16:03:35 crc kubenswrapper[4792]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnpb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn_openstack(c6394e15-1052-4fa6-9a74-bee5cce65ae7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 18 16:03:35 crc kubenswrapper[4792]: > logger="UnhandledError" Mar 18 16:03:35 crc kubenswrapper[4792]: E0318 16:03:35.149292 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" podUID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" Mar 18 16:03:35 crc kubenswrapper[4792]: E0318 16:03:35.279235 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" podUID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" Mar 18 16:03:35 crc kubenswrapper[4792]: I0318 16:03:35.590324 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-vjjkg"] Mar 18 16:03:36 crc kubenswrapper[4792]: I0318 16:03:36.287776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vjjkg" event={"ID":"840a93f2-c524-4d8e-a761-8a075a9266da","Type":"ContainerStarted","Data":"4926ec8d042c5c9d2e31b1ebb6aefb78b56c91e235d64ec4ab618b2f71a982c2"} Mar 18 16:03:42 crc kubenswrapper[4792]: I0318 16:03:42.692893 4792 scope.go:117] "RemoveContainer" containerID="9ddaf409b452419ceb06cf8585000096f662cf10bf6a11a5d59850cf54eba88c" Mar 18 16:03:42 crc kubenswrapper[4792]: I0318 16:03:42.854806 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:03:42 crc kubenswrapper[4792]: E0318 16:03:42.856623 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.376261 4792 scope.go:117] "RemoveContainer" containerID="c8f115ea38945674ba812fd56982b41e65a502103e68f5239b68b21bba217f9e" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.454832 4792 scope.go:117] "RemoveContainer" containerID="4f0591bb4e037d2141fc349b99e7b56de5eaeeb03c840b837da1ffde0a26b88a" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.629526 4792 scope.go:117] "RemoveContainer" containerID="e313a088eafeb6957bebebf6888575180f7655f78c16d6183ae9dcb439f10ee5" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.671149 4792 scope.go:117] "RemoveContainer" containerID="3e98eef60b92763f403fa2d548387d3ef4ea41ccf46420bd01da67c93ee7b5f9" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.682011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.716378 4792 scope.go:117] "RemoveContainer" containerID="b05bf655c6c629d37d3415b4fb0c06026d40ddfe51fc5f4cfa3001c67e0d8277" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.762343 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.852485 4792 scope.go:117] "RemoveContainer" containerID="5e6d2747378d6eb32183d9d2e95f609701c942b1fa7a10ff138275778fcdc71e" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.908713 4792 scope.go:117] "RemoveContainer" containerID="4e1e12215aecbdfd4ece57b53e97d16d651e67fb481d33da31e7a578f9f6f5d5" Mar 18 16:03:43 crc kubenswrapper[4792]: I0318 16:03:43.948588 4792 scope.go:117] "RemoveContainer" containerID="2b3d50ec8af83e573a82f3981293eacc6e8aef89012c7ea6b8ed678f9d336d9b" Mar 18 16:03:44 crc kubenswrapper[4792]: I0318 16:03:44.434961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-db-sync-vjjkg" event={"ID":"840a93f2-c524-4d8e-a761-8a075a9266da","Type":"ContainerStarted","Data":"de33aa65296ff72106b1bb7f0106193224eb85bacd2274dfc3dcb8f3cc30e2dd"} Mar 18 16:03:44 crc kubenswrapper[4792]: E0318 16:03:44.971767 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:44 crc kubenswrapper[4792]: E0318 16:03:44.976119 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:44 crc kubenswrapper[4792]: E0318 16:03:44.981219 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 16:03:44 crc kubenswrapper[4792]: E0318 16:03:44.981297 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6c99df75d9-tzn29" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.496127 4792 generic.go:334] "Generic (PLEG): container finished" podID="db801d44-72e6-44db-a478-e745ecf3d278" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" exitCode=0 Mar 18 
16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.497147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c99df75d9-tzn29" event={"ID":"db801d44-72e6-44db-a478-e745ecf3d278","Type":"ContainerDied","Data":"daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453"} Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.684779 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.705217 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data\") pod \"db801d44-72e6-44db-a478-e745ecf3d278\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.705284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom\") pod \"db801d44-72e6-44db-a478-e745ecf3d278\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.705343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkzwx\" (UniqueName: \"kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx\") pod \"db801d44-72e6-44db-a478-e745ecf3d278\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.705448 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle\") pod \"db801d44-72e6-44db-a478-e745ecf3d278\" (UID: \"db801d44-72e6-44db-a478-e745ecf3d278\") " Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.715337 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db801d44-72e6-44db-a478-e745ecf3d278" (UID: "db801d44-72e6-44db-a478-e745ecf3d278"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.716139 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-vjjkg" podStartSLOduration=8.874825211 podStartE2EDuration="16.716119564s" podCreationTimestamp="2026-03-18 16:03:29 +0000 UTC" firstStartedPulling="2026-03-18 16:03:35.591145934 +0000 UTC m=+1764.460474871" lastFinishedPulling="2026-03-18 16:03:43.432440297 +0000 UTC m=+1772.301769224" observedRunningTime="2026-03-18 16:03:44.461686446 +0000 UTC m=+1773.331015383" watchObservedRunningTime="2026-03-18 16:03:45.716119564 +0000 UTC m=+1774.585448501" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.725465 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx" (OuterVolumeSpecName: "kube-api-access-gkzwx") pod "db801d44-72e6-44db-a478-e745ecf3d278" (UID: "db801d44-72e6-44db-a478-e745ecf3d278"). InnerVolumeSpecName "kube-api-access-gkzwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.783101 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db801d44-72e6-44db-a478-e745ecf3d278" (UID: "db801d44-72e6-44db-a478-e745ecf3d278"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.810963 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkzwx\" (UniqueName: \"kubernetes.io/projected/db801d44-72e6-44db-a478-e745ecf3d278-kube-api-access-gkzwx\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.811024 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.811038 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.856283 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data" (OuterVolumeSpecName: "config-data") pod "db801d44-72e6-44db-a478-e745ecf3d278" (UID: "db801d44-72e6-44db-a478-e745ecf3d278"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:45 crc kubenswrapper[4792]: I0318 16:03:45.916426 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db801d44-72e6-44db-a478-e745ecf3d278-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:46 crc kubenswrapper[4792]: I0318 16:03:46.511962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c99df75d9-tzn29" event={"ID":"db801d44-72e6-44db-a478-e745ecf3d278","Type":"ContainerDied","Data":"f9ab5c1eacc3fe53f1d604307fd9fe46328369e2896930b8e9427f269db48a62"} Mar 18 16:03:46 crc kubenswrapper[4792]: I0318 16:03:46.512035 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6c99df75d9-tzn29" Mar 18 16:03:46 crc kubenswrapper[4792]: I0318 16:03:46.512043 4792 scope.go:117] "RemoveContainer" containerID="daf2d92d6158525bac89b8b24118c5b841a9f6b8f15f8c8814de9acb136c2453" Mar 18 16:03:46 crc kubenswrapper[4792]: I0318 16:03:46.549544 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 16:03:46 crc kubenswrapper[4792]: I0318 16:03:46.567171 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6c99df75d9-tzn29"] Mar 18 16:03:47 crc kubenswrapper[4792]: I0318 16:03:47.871294 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db801d44-72e6-44db-a478-e745ecf3d278" path="/var/lib/kubelet/pods/db801d44-72e6-44db-a478-e745ecf3d278/volumes" Mar 18 16:03:48 crc kubenswrapper[4792]: I0318 16:03:48.538248 4792 generic.go:334] "Generic (PLEG): container finished" podID="840a93f2-c524-4d8e-a761-8a075a9266da" containerID="de33aa65296ff72106b1bb7f0106193224eb85bacd2274dfc3dcb8f3cc30e2dd" exitCode=0 Mar 18 16:03:48 crc kubenswrapper[4792]: I0318 16:03:48.538307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vjjkg" event={"ID":"840a93f2-c524-4d8e-a761-8a075a9266da","Type":"ContainerDied","Data":"de33aa65296ff72106b1bb7f0106193224eb85bacd2274dfc3dcb8f3cc30e2dd"} Mar 18 16:03:49 crc kubenswrapper[4792]: I0318 16:03:49.045102 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="rabbitmq" containerID="cri-o://4039bd52f24043dbbc782ccbdcbb41fe807871286bbdd4f16a6108b39598110c" gracePeriod=604795 Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.004528 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.118034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle\") pod \"840a93f2-c524-4d8e-a761-8a075a9266da\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.118089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data\") pod \"840a93f2-c524-4d8e-a761-8a075a9266da\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.118215 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhq9q\" (UniqueName: \"kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q\") pod \"840a93f2-c524-4d8e-a761-8a075a9266da\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.118292 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts\") pod \"840a93f2-c524-4d8e-a761-8a075a9266da\" (UID: \"840a93f2-c524-4d8e-a761-8a075a9266da\") " Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.124519 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts" (OuterVolumeSpecName: "scripts") pod "840a93f2-c524-4d8e-a761-8a075a9266da" (UID: "840a93f2-c524-4d8e-a761-8a075a9266da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.140203 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q" (OuterVolumeSpecName: "kube-api-access-fhq9q") pod "840a93f2-c524-4d8e-a761-8a075a9266da" (UID: "840a93f2-c524-4d8e-a761-8a075a9266da"). InnerVolumeSpecName "kube-api-access-fhq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.157790 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "840a93f2-c524-4d8e-a761-8a075a9266da" (UID: "840a93f2-c524-4d8e-a761-8a075a9266da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.159134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data" (OuterVolumeSpecName: "config-data") pod "840a93f2-c524-4d8e-a761-8a075a9266da" (UID: "840a93f2-c524-4d8e-a761-8a075a9266da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.221862 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.221901 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.221911 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhq9q\" (UniqueName: \"kubernetes.io/projected/840a93f2-c524-4d8e-a761-8a075a9266da-kube-api-access-fhq9q\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.221921 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/840a93f2-c524-4d8e-a761-8a075a9266da-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.563948 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vjjkg" event={"ID":"840a93f2-c524-4d8e-a761-8a075a9266da","Type":"ContainerDied","Data":"4926ec8d042c5c9d2e31b1ebb6aefb78b56c91e235d64ec4ab618b2f71a982c2"} Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.564227 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4926ec8d042c5c9d2e31b1ebb6aefb78b56c91e235d64ec4ab618b2f71a982c2" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.564010 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-vjjkg" Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.566579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" event={"ID":"c6394e15-1052-4fa6-9a74-bee5cce65ae7","Type":"ContainerStarted","Data":"4392a73650ebcd7887fb2955f0ce6473f01f2d15c7410cd7967d21437d99b394"} Mar 18 16:03:50 crc kubenswrapper[4792]: I0318 16:03:50.596664 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" podStartSLOduration=1.628911758 podStartE2EDuration="33.596638728s" podCreationTimestamp="2026-03-18 16:03:17 +0000 UTC" firstStartedPulling="2026-03-18 16:03:18.355447829 +0000 UTC m=+1747.224776766" lastFinishedPulling="2026-03-18 16:03:50.323174799 +0000 UTC m=+1779.192503736" observedRunningTime="2026-03-18 16:03:50.586302561 +0000 UTC m=+1779.455631498" watchObservedRunningTime="2026-03-18 16:03:50.596638728 +0000 UTC m=+1779.465967665" Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.205458 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.206631 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-api" containerID="cri-o://339bc94021efac167fcbd3ffdb6eb353ea55450dd68041b222d8c6ff99fe4a3e" gracePeriod=30 Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.207666 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-listener" containerID="cri-o://b216462437c6ae53381b77f30bb5220e18d1576fea7d99d9aaa310eb1a905c8b" gracePeriod=30 Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.207733 4792 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/aodh-0" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-notifier" containerID="cri-o://85faedf7edd2232b6dd3eb1b897f02c14706d3a065a10c7854bc795122383dbf" gracePeriod=30 Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.207647 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-evaluator" containerID="cri-o://e1b40dff9f04b804d952bda6ef843bcf7d9f30a280bdfe8bdd16165dfe23d28e" gracePeriod=30 Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.613058 4792 generic.go:334] "Generic (PLEG): container finished" podID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerID="339bc94021efac167fcbd3ffdb6eb353ea55450dd68041b222d8c6ff99fe4a3e" exitCode=0 Mar 18 16:03:54 crc kubenswrapper[4792]: I0318 16:03:54.613134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerDied","Data":"339bc94021efac167fcbd3ffdb6eb353ea55450dd68041b222d8c6ff99fe4a3e"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.651581 4792 generic.go:334] "Generic (PLEG): container finished" podID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerID="4039bd52f24043dbbc782ccbdcbb41fe807871286bbdd4f16a6108b39598110c" exitCode=0 Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.651690 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerDied","Data":"4039bd52f24043dbbc782ccbdcbb41fe807871286bbdd4f16a6108b39598110c"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.675898 4792 generic.go:334] "Generic (PLEG): container finished" podID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerID="85faedf7edd2232b6dd3eb1b897f02c14706d3a065a10c7854bc795122383dbf" exitCode=0 Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.675950 4792 generic.go:334] 
"Generic (PLEG): container finished" podID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerID="e1b40dff9f04b804d952bda6ef843bcf7d9f30a280bdfe8bdd16165dfe23d28e" exitCode=0 Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.675990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerDied","Data":"85faedf7edd2232b6dd3eb1b897f02c14706d3a065a10c7854bc795122383dbf"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.676022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerDied","Data":"e1b40dff9f04b804d952bda6ef843bcf7d9f30a280bdfe8bdd16165dfe23d28e"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.750100 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.763571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766118 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: 
\"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766229 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766434 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766609 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5l95\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.766694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins\") pod \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\" (UID: \"753d5ec4-134d-48f9-ad6c-aa17f8856b5a\") " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.768599 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.768635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.772328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.772531 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.773489 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95" (OuterVolumeSpecName: "kube-api-access-f5l95") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "kube-api-access-f5l95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.773862 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.783319 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info" (OuterVolumeSpecName: "pod-info") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872655 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872689 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872704 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872721 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872733 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872745 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5l95\" (UniqueName: 
\"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-kube-api-access-f5l95\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.872758 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.899662 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data" (OuterVolumeSpecName: "config-data") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.901935 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad" (OuterVolumeSpecName: "persistence") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.917768 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf" (OuterVolumeSpecName: "server-conf") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.978700 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") on node \"crc\" " Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.978769 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:55.978785 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.000286 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "753d5ec4-134d-48f9-ad6c-aa17f8856b5a" (UID: "753d5ec4-134d-48f9-ad6c-aa17f8856b5a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.040714 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.041481 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad") on node "crc" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.080800 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/753d5ec4-134d-48f9-ad6c-aa17f8856b5a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.080840 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.721661 4792 generic.go:334] "Generic (PLEG): container finished" podID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerID="b216462437c6ae53381b77f30bb5220e18d1576fea7d99d9aaa310eb1a905c8b" exitCode=0 Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.721734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerDied","Data":"b216462437c6ae53381b77f30bb5220e18d1576fea7d99d9aaa310eb1a905c8b"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.724354 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"753d5ec4-134d-48f9-ad6c-aa17f8856b5a","Type":"ContainerDied","Data":"3c6dea45e282aa58d2079f7b2a2e49d09bf9a589d398a47886cdab30f0b4d4ba"} Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.724399 4792 scope.go:117] "RemoveContainer" containerID="4039bd52f24043dbbc782ccbdcbb41fe807871286bbdd4f16a6108b39598110c" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.724552 4792 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.841295 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.842430 4792 scope.go:117] "RemoveContainer" containerID="7782a9ac2dbdefa63ef4731235464b9470b5ae66917c638938707289f3e36397" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.855207 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.855544 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.876142 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.905072 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.945693 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946368 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="setup-container" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946399 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="setup-container" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 
16:03:56.946421 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840a93f2-c524-4d8e-a761-8a075a9266da" containerName="aodh-db-sync" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946430 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="840a93f2-c524-4d8e-a761-8a075a9266da" containerName="aodh-db-sync" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946445 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-listener" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946453 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-listener" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946469 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946476 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946495 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-evaluator" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946502 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-evaluator" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946520 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-api" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946527 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-api" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946553 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-notifier" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946561 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-notifier" Mar 18 16:03:56 crc kubenswrapper[4792]: E0318 16:03:56.946587 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="rabbitmq" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946595 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="rabbitmq" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946885 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" containerName="rabbitmq" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946904 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="db801d44-72e6-44db-a478-e745ecf3d278" containerName="heat-engine" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946915 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-notifier" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946941 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-evaluator" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="840a93f2-c524-4d8e-a761-8a075a9266da" containerName="aodh-db-sync" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.946990 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-listener" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.947002 4792 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" containerName="aodh-api" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.948652 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 16:03:56 crc kubenswrapper[4792]: I0318 16:03:56.959703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xtsg\" (UniqueName: \"kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003625 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.003775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs\") pod \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\" (UID: \"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba\") " Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.012918 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts" (OuterVolumeSpecName: "scripts") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.021637 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg" (OuterVolumeSpecName: "kube-api-access-2xtsg") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). InnerVolumeSpecName "kube-api-access-2xtsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.095503 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107522 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-config-data\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " 
pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107826 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plrnt\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-kube-api-access-plrnt\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.107962 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc 
kubenswrapper[4792]: I0318 16:03:57.108082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.108250 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.108273 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.108285 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xtsg\" (UniqueName: \"kubernetes.io/projected/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-kube-api-access-2xtsg\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.141152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.185648 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-config-data\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc 
kubenswrapper[4792]: I0318 16:03:57.210711 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.210950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plrnt\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-kube-api-access-plrnt\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.211065 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.211187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.211327 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.211384 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.211886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.212400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.212552 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.212635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-config-data\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.213248 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-server-conf\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.215757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.215886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-pod-info\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.216524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.217059 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 
16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.220731 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.220797 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e806a32f563ffb605d360d449207970d828596911ba7075052f9c981032e8d8/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.223924 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data" (OuterVolumeSpecName: "config-data") pod "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" (UID: "fcb5e796-60f0-44bf-b5bc-961f71fdd1ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.231942 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plrnt\" (UniqueName: \"kubernetes.io/projected/b2a1ad0b-1684-4f7b-a7f0-023c7a15286a-kube-api-access-plrnt\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.313479 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.334297 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-560e676c-7fb6-4e9f-8d3d-b1230185f8ad\") pod \"rabbitmq-server-1\" (UID: \"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a\") " pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.578409 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.782411 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fcb5e796-60f0-44bf-b5bc-961f71fdd1ba","Type":"ContainerDied","Data":"e4242b2a6738b067d7a1737c3e37d02b65813652b87fad1065fb6264ef95b129"} Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.782472 4792 scope.go:117] "RemoveContainer" containerID="b216462437c6ae53381b77f30bb5220e18d1576fea7d99d9aaa310eb1a905c8b" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.782518 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.845044 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.872156 4792 scope.go:117] "RemoveContainer" containerID="85faedf7edd2232b6dd3eb1b897f02c14706d3a065a10c7854bc795122383dbf" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.905285 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753d5ec4-134d-48f9-ad6c-aa17f8856b5a" path="/var/lib/kubelet/pods/753d5ec4-134d-48f9-ad6c-aa17f8856b5a/volumes" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.916600 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.916655 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.924143 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.924561 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.929702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.930080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.930286 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.931016 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5q5xg" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.940924 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 16:03:57 crc kubenswrapper[4792]: I0318 16:03:57.964073 4792 scope.go:117] "RemoveContainer" containerID="e1b40dff9f04b804d952bda6ef843bcf7d9f30a280bdfe8bdd16165dfe23d28e" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.006376 4792 scope.go:117] "RemoveContainer" containerID="339bc94021efac167fcbd3ffdb6eb353ea55450dd68041b222d8c6ff99fe4a3e" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.034343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-config-data\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.034422 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc 
kubenswrapper[4792]: I0318 16:03:58.034593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r6vs\" (UniqueName: \"kubernetes.io/projected/e51032f9-f6e1-4f72-9185-784c3acae24b-kube-api-access-5r6vs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.034686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.034761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-public-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.034819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-scripts\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.136995 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.137329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-public-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.137367 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-scripts\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.137521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-config-data\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.137580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.137627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r6vs\" (UniqueName: \"kubernetes.io/projected/e51032f9-f6e1-4f72-9185-784c3acae24b-kube-api-access-5r6vs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.145181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.146097 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-scripts\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.156293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-config-data\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.160529 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-public-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.162353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r6vs\" (UniqueName: \"kubernetes.io/projected/e51032f9-f6e1-4f72-9185-784c3acae24b-kube-api-access-5r6vs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.176194 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51032f9-f6e1-4f72-9185-784c3acae24b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e51032f9-f6e1-4f72-9185-784c3acae24b\") " pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.307123 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.435416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.810465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a","Type":"ContainerStarted","Data":"5e4a6ded64095c51dcde109b1fe7395bf2cfbc28d8e3810dff5eb3ce13072b65"} Mar 18 16:03:58 crc kubenswrapper[4792]: I0318 16:03:58.833381 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 16:03:59 crc kubenswrapper[4792]: I0318 16:03:59.822897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e51032f9-f6e1-4f72-9185-784c3acae24b","Type":"ContainerStarted","Data":"2c7497c95020da113eff8d088b8b5de016ecdca4c4a9dacdb8836a3e5f14f60c"} Mar 18 16:03:59 crc kubenswrapper[4792]: I0318 16:03:59.823434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e51032f9-f6e1-4f72-9185-784c3acae24b","Type":"ContainerStarted","Data":"6626261f1fd993c0f13845db560705259c0c01197ac65681e82c77e456a7b4ec"} Mar 18 16:03:59 crc kubenswrapper[4792]: I0318 16:03:59.875453 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb5e796-60f0-44bf-b5bc-961f71fdd1ba" path="/var/lib/kubelet/pods/fcb5e796-60f0-44bf-b5bc-961f71fdd1ba/volumes" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.160511 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564164-84j7r"] Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.162557 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.165405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.165409 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.165503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.174876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-84j7r"] Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.306931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpcb\" (UniqueName: \"kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb\") pod \"auto-csr-approver-29564164-84j7r\" (UID: \"67989f84-d392-47e7-8358-37830b7dcaee\") " pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.408977 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpcb\" (UniqueName: \"kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb\") pod \"auto-csr-approver-29564164-84j7r\" (UID: \"67989f84-d392-47e7-8358-37830b7dcaee\") " pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.432937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpcb\" (UniqueName: \"kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb\") pod \"auto-csr-approver-29564164-84j7r\" (UID: \"67989f84-d392-47e7-8358-37830b7dcaee\") " 
pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.497015 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.840385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a","Type":"ContainerStarted","Data":"e9db7c5e68e8c8101cdedf445fd783ab61a801457b6134af218b9827eccb0ef5"} Mar 18 16:04:00 crc kubenswrapper[4792]: I0318 16:04:00.845151 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e51032f9-f6e1-4f72-9185-784c3acae24b","Type":"ContainerStarted","Data":"7a9d57a247297df37ce908abbfb9ecf579479d44adc11192bc675f477611bfde"} Mar 18 16:04:01 crc kubenswrapper[4792]: I0318 16:04:01.000986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-84j7r"] Mar 18 16:04:01 crc kubenswrapper[4792]: W0318 16:04:01.061350 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67989f84_d392_47e7_8358_37830b7dcaee.slice/crio-cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d WatchSource:0}: Error finding container cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d: Status 404 returned error can't find the container with id cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d Mar 18 16:04:01 crc kubenswrapper[4792]: I0318 16:04:01.890367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-84j7r" event={"ID":"67989f84-d392-47e7-8358-37830b7dcaee","Type":"ContainerStarted","Data":"cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d"} Mar 18 16:04:01 crc kubenswrapper[4792]: I0318 16:04:01.897047 4792 generic.go:334] "Generic (PLEG): container 
finished" podID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" containerID="4392a73650ebcd7887fb2955f0ce6473f01f2d15c7410cd7967d21437d99b394" exitCode=0 Mar 18 16:04:01 crc kubenswrapper[4792]: I0318 16:04:01.898624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" event={"ID":"c6394e15-1052-4fa6-9a74-bee5cce65ae7","Type":"ContainerDied","Data":"4392a73650ebcd7887fb2955f0ce6473f01f2d15c7410cd7967d21437d99b394"} Mar 18 16:04:02 crc kubenswrapper[4792]: I0318 16:04:02.921944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-84j7r" event={"ID":"67989f84-d392-47e7-8358-37830b7dcaee","Type":"ContainerStarted","Data":"9f6fece026c8d22796dd3944c910c620335f81930fd9b88315b465103fe800cd"} Mar 18 16:04:02 crc kubenswrapper[4792]: I0318 16:04:02.927254 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e51032f9-f6e1-4f72-9185-784c3acae24b","Type":"ContainerStarted","Data":"c561cad4395dc8ad173afa154c72b99df2b4cbc44b6a78aad9484f5354f3f30f"} Mar 18 16:04:02 crc kubenswrapper[4792]: I0318 16:04:02.944086 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564164-84j7r" podStartSLOduration=1.803088555 podStartE2EDuration="2.944067252s" podCreationTimestamp="2026-03-18 16:04:00 +0000 UTC" firstStartedPulling="2026-03-18 16:04:01.065748682 +0000 UTC m=+1789.935077609" lastFinishedPulling="2026-03-18 16:04:02.206727369 +0000 UTC m=+1791.076056306" observedRunningTime="2026-03-18 16:04:02.937145603 +0000 UTC m=+1791.806474540" watchObservedRunningTime="2026-03-18 16:04:02.944067252 +0000 UTC m=+1791.813396189" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.812449 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.934466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam\") pod \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.934542 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory\") pod \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.934591 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnpb4\" (UniqueName: \"kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4\") pod \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.934629 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle\") pod \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\" (UID: \"c6394e15-1052-4fa6-9a74-bee5cce65ae7\") " Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.939413 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c6394e15-1052-4fa6-9a74-bee5cce65ae7" (UID: "c6394e15-1052-4fa6-9a74-bee5cce65ae7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.940749 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4" (OuterVolumeSpecName: "kube-api-access-bnpb4") pod "c6394e15-1052-4fa6-9a74-bee5cce65ae7" (UID: "c6394e15-1052-4fa6-9a74-bee5cce65ae7"). InnerVolumeSpecName "kube-api-access-bnpb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.940848 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.981402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c6394e15-1052-4fa6-9a74-bee5cce65ae7" (UID: "c6394e15-1052-4fa6-9a74-bee5cce65ae7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:03 crc kubenswrapper[4792]: I0318 16:04:03.982082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory" (OuterVolumeSpecName: "inventory") pod "c6394e15-1052-4fa6-9a74-bee5cce65ae7" (UID: "c6394e15-1052-4fa6-9a74-bee5cce65ae7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.040143 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.040182 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.040206 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnpb4\" (UniqueName: \"kubernetes.io/projected/c6394e15-1052-4fa6-9a74-bee5cce65ae7-kube-api-access-bnpb4\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.040219 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6394e15-1052-4fa6-9a74-bee5cce65ae7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.043311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn" event={"ID":"c6394e15-1052-4fa6-9a74-bee5cce65ae7","Type":"ContainerDied","Data":"a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e"} Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.043361 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cd3cbf3db576db4e85d67af9c4694a75e20ad2f766a27cb3dd261c8189cd5e" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.093987 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz"] Mar 18 16:04:04 crc kubenswrapper[4792]: E0318 16:04:04.094674 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.094691 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.095041 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6394e15-1052-4fa6-9a74-bee5cce65ae7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.096079 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.135067 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz"] Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.245611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.245681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprjb\" (UniqueName: \"kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.246005 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.348638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.348694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprjb\" (UniqueName: \"kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.349017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.353444 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: 
\"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.354712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.366480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprjb\" (UniqueName: \"kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bfckz\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.436114 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.953698 4792 generic.go:334] "Generic (PLEG): container finished" podID="67989f84-d392-47e7-8358-37830b7dcaee" containerID="9f6fece026c8d22796dd3944c910c620335f81930fd9b88315b465103fe800cd" exitCode=0 Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.953783 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-84j7r" event={"ID":"67989f84-d392-47e7-8358-37830b7dcaee","Type":"ContainerDied","Data":"9f6fece026c8d22796dd3944c910c620335f81930fd9b88315b465103fe800cd"} Mar 18 16:04:04 crc kubenswrapper[4792]: I0318 16:04:04.959175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e51032f9-f6e1-4f72-9185-784c3acae24b","Type":"ContainerStarted","Data":"d9cf4080dd875d18fec84c4f90c17e774be8245919369c16e9b2bac57d86bd7f"} Mar 18 16:04:05 crc kubenswrapper[4792]: I0318 16:04:05.028233 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.027645252 podStartE2EDuration="8.028196326s" podCreationTimestamp="2026-03-18 16:03:57 +0000 UTC" firstStartedPulling="2026-03-18 16:03:58.842677289 +0000 UTC m=+1787.712006216" lastFinishedPulling="2026-03-18 16:04:03.843228363 +0000 UTC m=+1792.712557290" observedRunningTime="2026-03-18 16:04:04.995809468 +0000 UTC m=+1793.865138405" watchObservedRunningTime="2026-03-18 16:04:05.028196326 +0000 UTC m=+1793.897525273" Mar 18 16:04:05 crc kubenswrapper[4792]: W0318 16:04:05.133119 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e44911_c925_4e0a_bdb7_849994798535.slice/crio-c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60 WatchSource:0}: Error finding container c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60: Status 404 
returned error can't find the container with id c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60 Mar 18 16:04:05 crc kubenswrapper[4792]: I0318 16:04:05.161736 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz"] Mar 18 16:04:05 crc kubenswrapper[4792]: I0318 16:04:05.973746 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" event={"ID":"60e44911-c925-4e0a-bdb7-849994798535","Type":"ContainerStarted","Data":"4c98a6ecbfd0c8bc0be44feb57de5b836466f6691f78b8668799b5b11d99b253"} Mar 18 16:04:05 crc kubenswrapper[4792]: I0318 16:04:05.974092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" event={"ID":"60e44911-c925-4e0a-bdb7-849994798535","Type":"ContainerStarted","Data":"c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60"} Mar 18 16:04:05 crc kubenswrapper[4792]: I0318 16:04:05.994490 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" podStartSLOduration=1.45898548 podStartE2EDuration="1.994475747s" podCreationTimestamp="2026-03-18 16:04:04 +0000 UTC" firstStartedPulling="2026-03-18 16:04:05.137215736 +0000 UTC m=+1794.006544673" lastFinishedPulling="2026-03-18 16:04:05.672706003 +0000 UTC m=+1794.542034940" observedRunningTime="2026-03-18 16:04:05.992485444 +0000 UTC m=+1794.861814381" watchObservedRunningTime="2026-03-18 16:04:05.994475747 +0000 UTC m=+1794.863804684" Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.420497 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.518603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpcb\" (UniqueName: \"kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb\") pod \"67989f84-d392-47e7-8358-37830b7dcaee\" (UID: \"67989f84-d392-47e7-8358-37830b7dcaee\") " Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.523494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb" (OuterVolumeSpecName: "kube-api-access-5tpcb") pod "67989f84-d392-47e7-8358-37830b7dcaee" (UID: "67989f84-d392-47e7-8358-37830b7dcaee"). InnerVolumeSpecName "kube-api-access-5tpcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.623354 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpcb\" (UniqueName: \"kubernetes.io/projected/67989f84-d392-47e7-8358-37830b7dcaee-kube-api-access-5tpcb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.987304 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564164-84j7r" Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.987331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564164-84j7r" event={"ID":"67989f84-d392-47e7-8358-37830b7dcaee","Type":"ContainerDied","Data":"cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d"} Mar 18 16:04:06 crc kubenswrapper[4792]: I0318 16:04:06.987394 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc69021925ee370ff76a36daa908215adffbbb051080ccef79863b83dd5b4a3d" Mar 18 16:04:07 crc kubenswrapper[4792]: I0318 16:04:07.038631 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-5fcfc"] Mar 18 16:04:07 crc kubenswrapper[4792]: I0318 16:04:07.051432 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564158-5fcfc"] Mar 18 16:04:07 crc kubenswrapper[4792]: I0318 16:04:07.868905 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d587f36a-ade4-499c-b160-673d58efb861" path="/var/lib/kubelet/pods/d587f36a-ade4-499c-b160-673d58efb861/volumes" Mar 18 16:04:09 crc kubenswrapper[4792]: I0318 16:04:09.017621 4792 generic.go:334] "Generic (PLEG): container finished" podID="60e44911-c925-4e0a-bdb7-849994798535" containerID="4c98a6ecbfd0c8bc0be44feb57de5b836466f6691f78b8668799b5b11d99b253" exitCode=0 Mar 18 16:04:09 crc kubenswrapper[4792]: I0318 16:04:09.019105 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" event={"ID":"60e44911-c925-4e0a-bdb7-849994798535","Type":"ContainerDied","Data":"4c98a6ecbfd0c8bc0be44feb57de5b836466f6691f78b8668799b5b11d99b253"} Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.592488 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.754116 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam\") pod \"60e44911-c925-4e0a-bdb7-849994798535\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.754262 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wprjb\" (UniqueName: \"kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb\") pod \"60e44911-c925-4e0a-bdb7-849994798535\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.754310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory\") pod \"60e44911-c925-4e0a-bdb7-849994798535\" (UID: \"60e44911-c925-4e0a-bdb7-849994798535\") " Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.760237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb" (OuterVolumeSpecName: "kube-api-access-wprjb") pod "60e44911-c925-4e0a-bdb7-849994798535" (UID: "60e44911-c925-4e0a-bdb7-849994798535"). InnerVolumeSpecName "kube-api-access-wprjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.787185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory" (OuterVolumeSpecName: "inventory") pod "60e44911-c925-4e0a-bdb7-849994798535" (UID: "60e44911-c925-4e0a-bdb7-849994798535"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.793805 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "60e44911-c925-4e0a-bdb7-849994798535" (UID: "60e44911-c925-4e0a-bdb7-849994798535"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.859070 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.859121 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wprjb\" (UniqueName: \"kubernetes.io/projected/60e44911-c925-4e0a-bdb7-849994798535-kube-api-access-wprjb\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:10 crc kubenswrapper[4792]: I0318 16:04:10.859133 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e44911-c925-4e0a-bdb7-849994798535-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.045571 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" event={"ID":"60e44911-c925-4e0a-bdb7-849994798535","Type":"ContainerDied","Data":"c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60"} Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.045616 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d17170299b9522b9761803e26466e15fd2eda0557be19667203d7b052f8f60" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 
16:04:11.045683 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bfckz" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.144788 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc"] Mar 18 16:04:11 crc kubenswrapper[4792]: E0318 16:04:11.145420 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67989f84-d392-47e7-8358-37830b7dcaee" containerName="oc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.145438 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="67989f84-d392-47e7-8358-37830b7dcaee" containerName="oc" Mar 18 16:04:11 crc kubenswrapper[4792]: E0318 16:04:11.145452 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e44911-c925-4e0a-bdb7-849994798535" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.145461 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e44911-c925-4e0a-bdb7-849994798535" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.146360 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="67989f84-d392-47e7-8358-37830b7dcaee" containerName="oc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.146414 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e44911-c925-4e0a-bdb7-849994798535" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.151247 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.154278 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.154425 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.154458 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.154561 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.191046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc"] Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.284368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.284466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.284564 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.284706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dhl\" (UniqueName: \"kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.387313 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.387856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.387992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.388099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dhl\" (UniqueName: \"kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.391577 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.391848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.398146 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.412921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dhl\" (UniqueName: \"kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.477401 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:04:11 crc kubenswrapper[4792]: I0318 16:04:11.871210 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:04:11 crc kubenswrapper[4792]: E0318 16:04:11.875096 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:04:12 crc kubenswrapper[4792]: W0318 16:04:12.097690 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8427835_8b71_4705_91e2_d82092ec93f5.slice/crio-5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30 WatchSource:0}: Error finding container 5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30: Status 404 returned error can't find the container with id 5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30 Mar 18 16:04:12 crc kubenswrapper[4792]: I0318 16:04:12.099328 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc"] Mar 18 16:04:12 crc kubenswrapper[4792]: I0318 16:04:12.609759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:04:13 crc kubenswrapper[4792]: I0318 16:04:13.076857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" event={"ID":"e8427835-8b71-4705-91e2-d82092ec93f5","Type":"ContainerStarted","Data":"da65379d15e79987c043376e5bde69dbd95488563a7f890e262fda97d1f83cb3"} Mar 18 16:04:13 crc kubenswrapper[4792]: I0318 16:04:13.077176 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" event={"ID":"e8427835-8b71-4705-91e2-d82092ec93f5","Type":"ContainerStarted","Data":"5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30"} Mar 18 16:04:13 crc kubenswrapper[4792]: I0318 16:04:13.117952 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" podStartSLOduration=1.613470472 podStartE2EDuration="2.117928824s" podCreationTimestamp="2026-03-18 16:04:11 +0000 UTC" firstStartedPulling="2026-03-18 16:04:12.102271586 +0000 UTC m=+1800.971600523" lastFinishedPulling="2026-03-18 16:04:12.606729948 +0000 UTC m=+1801.476058875" observedRunningTime="2026-03-18 16:04:13.110921992 +0000 UTC m=+1801.980250949" watchObservedRunningTime="2026-03-18 16:04:13.117928824 +0000 UTC m=+1801.987257761" Mar 18 16:04:25 crc kubenswrapper[4792]: I0318 16:04:25.854692 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:04:25 crc kubenswrapper[4792]: E0318 16:04:25.855647 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:04:33 crc kubenswrapper[4792]: I0318 16:04:33.355807 4792 generic.go:334] "Generic (PLEG): container finished" podID="b2a1ad0b-1684-4f7b-a7f0-023c7a15286a" containerID="e9db7c5e68e8c8101cdedf445fd783ab61a801457b6134af218b9827eccb0ef5" exitCode=0 Mar 18 16:04:33 crc kubenswrapper[4792]: I0318 16:04:33.355875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a","Type":"ContainerDied","Data":"e9db7c5e68e8c8101cdedf445fd783ab61a801457b6134af218b9827eccb0ef5"} Mar 18 16:04:34 crc kubenswrapper[4792]: I0318 16:04:34.371822 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"b2a1ad0b-1684-4f7b-a7f0-023c7a15286a","Type":"ContainerStarted","Data":"3fc50f64d9d60d6084c0067de11bce0785de8f8e222e95bffe5456dc89877712"} Mar 18 16:04:34 crc kubenswrapper[4792]: I0318 16:04:34.372390 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 18 16:04:34 crc kubenswrapper[4792]: I0318 16:04:34.410965 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.410946842 podStartE2EDuration="38.410946842s" podCreationTimestamp="2026-03-18 16:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:04:34.397597084 +0000 UTC m=+1823.266926041" watchObservedRunningTime="2026-03-18 16:04:34.410946842 +0000 UTC m=+1823.280275779" Mar 18 16:04:37 crc kubenswrapper[4792]: I0318 16:04:37.855196 4792 scope.go:117] "RemoveContainer" 
containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:04:37 crc kubenswrapper[4792]: E0318 16:04:37.856115 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:04:44 crc kubenswrapper[4792]: I0318 16:04:44.574184 4792 scope.go:117] "RemoveContainer" containerID="88cf61d13abf105698b1cc60922b522f27c36a821f7ee2adcc8b4002d05a4c71" Mar 18 16:04:44 crc kubenswrapper[4792]: I0318 16:04:44.637198 4792 scope.go:117] "RemoveContainer" containerID="3ef6bfddcc01f150b84d703a422b8347269df5148d2c18e20cda9082dac149c5" Mar 18 16:04:44 crc kubenswrapper[4792]: I0318 16:04:44.729373 4792 scope.go:117] "RemoveContainer" containerID="fb80b33f4c260a249e77205e69bd3991689f30d26a856db44390982be8e4d6a0" Mar 18 16:04:47 crc kubenswrapper[4792]: I0318 16:04:47.581200 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 18 16:04:47 crc kubenswrapper[4792]: I0318 16:04:47.689816 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:49 crc kubenswrapper[4792]: I0318 16:04:49.854920 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:04:49 crc kubenswrapper[4792]: E0318 16:04:49.855772 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:04:51 crc kubenswrapper[4792]: I0318 16:04:51.906563 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="rabbitmq" containerID="cri-o://a3bc29f7139dbf87b29ddc94cff1982a1d93cede423621e24bd609c3d2e251e1" gracePeriod=604796 Mar 18 16:04:56 crc kubenswrapper[4792]: I0318 16:04:56.761758 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.675554 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerID="a3bc29f7139dbf87b29ddc94cff1982a1d93cede423621e24bd609c3d2e251e1" exitCode=0 Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.675648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerDied","Data":"a3bc29f7139dbf87b29ddc94cff1982a1d93cede423621e24bd609c3d2e251e1"} Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.676234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1","Type":"ContainerDied","Data":"3c6df70d6a00540530ae4d5ae33e5ae998e9eaa9e64a681b6bf004040ca3e165"} Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.676254 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6df70d6a00540530ae4d5ae33e5ae998e9eaa9e64a681b6bf004040ca3e165" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.686538 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.792676 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.793562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.793889 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.793939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.793970 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794239 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wltw\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.794899 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info\") pod 
\"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.795017 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret\") pod \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\" (UID: \"a3217a72-3ad4-4bb5-bf86-c1daa2e409c1\") " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.795255 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.795934 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.798199 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.798234 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.798247 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.801353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.802775 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info" (OuterVolumeSpecName: "pod-info") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.805157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw" (OuterVolumeSpecName: "kube-api-access-7wltw") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "kube-api-access-7wltw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.811355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.847704 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290" (OuterVolumeSpecName: "persistence") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.864458 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data" (OuterVolumeSpecName: "config-data") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.892450 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf" (OuterVolumeSpecName: "server-conf") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901487 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") on node \"crc\" " Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901527 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901543 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901556 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wltw\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-kube-api-access-7wltw\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901571 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901583 4792 reconciler_common.go:293] "Volume detached for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.901594 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.941884 4792 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.942062 4792 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290") on node "crc" Mar 18 16:04:58 crc kubenswrapper[4792]: I0318 16:04:58.987399 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" (UID: "a3217a72-3ad4-4bb5-bf86-c1daa2e409c1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.003371 4792 reconciler_common.go:293] "Volume detached for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.003407 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.690290 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.730797 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.749802 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.772600 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:59 crc kubenswrapper[4792]: E0318 16:04:59.773231 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="setup-container" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.773254 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="setup-container" Mar 18 16:04:59 crc kubenswrapper[4792]: E0318 16:04:59.773330 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="rabbitmq" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.773340 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="rabbitmq" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.773636 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" containerName="rabbitmq" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.775262 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.802403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.829794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b4b6514-5ee6-4653-acfc-b45efe6e7263-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.829860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.829889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.829969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b4b6514-5ee6-4653-acfc-b45efe6e7263-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830678 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.830885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkxf7\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-kube-api-access-nkxf7\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.882767 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3217a72-3ad4-4bb5-bf86-c1daa2e409c1" path="/var/lib/kubelet/pods/a3217a72-3ad4-4bb5-bf86-c1daa2e409c1/volumes" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.933341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.933420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 
16:04:59.933447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b4b6514-5ee6-4653-acfc-b45efe6e7263-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.933502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.933577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.934374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.934702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.934762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.934795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkxf7\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-kube-api-access-nkxf7\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b4b6514-5ee6-4653-acfc-b45efe6e7263-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935444 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935743 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.935773 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1628534424672959792f013c755b2348fd03dde948952a742875a82406539b79/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.936780 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.938931 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b4b6514-5ee6-4653-acfc-b45efe6e7263-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.939616 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b4b6514-5ee6-4653-acfc-b45efe6e7263-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.941267 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.942296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.942620 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b4b6514-5ee6-4653-acfc-b45efe6e7263-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:04:59 crc kubenswrapper[4792]: I0318 16:04:59.959316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkxf7\" (UniqueName: \"kubernetes.io/projected/7b4b6514-5ee6-4653-acfc-b45efe6e7263-kube-api-access-nkxf7\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:05:00 crc kubenswrapper[4792]: I0318 16:05:00.034032 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f019-ba52-4bd1-9aea-09f3ec02c290\") pod \"rabbitmq-server-0\" (UID: \"7b4b6514-5ee6-4653-acfc-b45efe6e7263\") " pod="openstack/rabbitmq-server-0" Mar 18 16:05:00 crc kubenswrapper[4792]: I0318 16:05:00.104264 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 16:05:00 crc kubenswrapper[4792]: I0318 16:05:00.602085 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 16:05:00 crc kubenswrapper[4792]: W0318 16:05:00.605480 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b4b6514_5ee6_4653_acfc_b45efe6e7263.slice/crio-158291374bbfff6b6b1bd081b27061fb5a97d5056eb1c76f8554d4665e3b55f0 WatchSource:0}: Error finding container 158291374bbfff6b6b1bd081b27061fb5a97d5056eb1c76f8554d4665e3b55f0: Status 404 returned error can't find the container with id 158291374bbfff6b6b1bd081b27061fb5a97d5056eb1c76f8554d4665e3b55f0 Mar 18 16:05:00 crc kubenswrapper[4792]: I0318 16:05:00.705935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b4b6514-5ee6-4653-acfc-b45efe6e7263","Type":"ContainerStarted","Data":"158291374bbfff6b6b1bd081b27061fb5a97d5056eb1c76f8554d4665e3b55f0"} Mar 18 16:05:02 crc kubenswrapper[4792]: I0318 16:05:02.740612 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b4b6514-5ee6-4653-acfc-b45efe6e7263","Type":"ContainerStarted","Data":"92925ec79bff47e1b8dff94de7572356277d66dffc28a891bae796d96be4f7f7"} Mar 18 16:05:04 crc kubenswrapper[4792]: I0318 16:05:04.854803 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:05:05 crc kubenswrapper[4792]: I0318 16:05:05.775878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6"} Mar 18 16:05:35 crc kubenswrapper[4792]: I0318 16:05:35.111416 4792 generic.go:334] "Generic (PLEG): container finished" podID="7b4b6514-5ee6-4653-acfc-b45efe6e7263" containerID="92925ec79bff47e1b8dff94de7572356277d66dffc28a891bae796d96be4f7f7" exitCode=0 Mar 18 16:05:35 crc kubenswrapper[4792]: I0318 16:05:35.111540 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b4b6514-5ee6-4653-acfc-b45efe6e7263","Type":"ContainerDied","Data":"92925ec79bff47e1b8dff94de7572356277d66dffc28a891bae796d96be4f7f7"} Mar 18 16:05:36 crc kubenswrapper[4792]: I0318 16:05:36.125157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b4b6514-5ee6-4653-acfc-b45efe6e7263","Type":"ContainerStarted","Data":"c26f180e067afc4194cb410ad805fb5f04cb4197646591a36474b6d65f5dd66e"} Mar 18 16:05:36 crc kubenswrapper[4792]: I0318 16:05:36.125627 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 16:05:36 crc kubenswrapper[4792]: I0318 16:05:36.152482 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.152441773 podStartE2EDuration="37.152441773s" podCreationTimestamp="2026-03-18 16:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:05:36.147823128 +0000 UTC m=+1885.017152085" watchObservedRunningTime="2026-03-18 16:05:36.152441773 +0000 UTC m=+1885.021770710" Mar 18 16:05:44 crc kubenswrapper[4792]: I0318 16:05:44.942207 4792 scope.go:117] "RemoveContainer" containerID="473056b2f961908787d6d0bbd7279324cef361cac398147642946a887e53c916" Mar 18 
16:05:44 crc kubenswrapper[4792]: I0318 16:05:44.998563 4792 scope.go:117] "RemoveContainer" containerID="a3bc29f7139dbf87b29ddc94cff1982a1d93cede423621e24bd609c3d2e251e1" Mar 18 16:05:45 crc kubenswrapper[4792]: I0318 16:05:45.034478 4792 scope.go:117] "RemoveContainer" containerID="9119ca9c98137af4d67b7b0e56211f071e93407c34b0d86e6d14b21e1aaca9b8" Mar 18 16:05:50 crc kubenswrapper[4792]: I0318 16:05:50.107162 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.148615 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564166-6shz9"] Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.150708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.154258 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.155074 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.157201 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.162059 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-6shz9"] Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.251219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpxd\" (UniqueName: \"kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd\") pod \"auto-csr-approver-29564166-6shz9\" (UID: \"9a53da3c-f3b8-4a0b-8ce6-c71399912c79\") " 
pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.354126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpxd\" (UniqueName: \"kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd\") pod \"auto-csr-approver-29564166-6shz9\" (UID: \"9a53da3c-f3b8-4a0b-8ce6-c71399912c79\") " pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.376954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpxd\" (UniqueName: \"kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd\") pod \"auto-csr-approver-29564166-6shz9\" (UID: \"9a53da3c-f3b8-4a0b-8ce6-c71399912c79\") " pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.474801 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.941943 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-6shz9"] Mar 18 16:06:00 crc kubenswrapper[4792]: I0318 16:06:00.943389 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:06:01 crc kubenswrapper[4792]: I0318 16:06:01.415920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-6shz9" event={"ID":"9a53da3c-f3b8-4a0b-8ce6-c71399912c79","Type":"ContainerStarted","Data":"2bd2a90e9ee5e6fc52e3bb5e414eac8fcbd7f9b0318255c4df0d3b9d80a948a4"} Mar 18 16:06:02 crc kubenswrapper[4792]: I0318 16:06:02.435317 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-6shz9" 
event={"ID":"9a53da3c-f3b8-4a0b-8ce6-c71399912c79","Type":"ContainerStarted","Data":"9901a8f054b8edc0cb76beda24d571e1f848738729ad7ee4328cf25618e6ecbe"} Mar 18 16:06:02 crc kubenswrapper[4792]: I0318 16:06:02.465129 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564166-6shz9" podStartSLOduration=1.298232245 podStartE2EDuration="2.465100858s" podCreationTimestamp="2026-03-18 16:06:00 +0000 UTC" firstStartedPulling="2026-03-18 16:06:00.943188009 +0000 UTC m=+1909.812516946" lastFinishedPulling="2026-03-18 16:06:02.110056622 +0000 UTC m=+1910.979385559" observedRunningTime="2026-03-18 16:06:02.453436133 +0000 UTC m=+1911.322765150" watchObservedRunningTime="2026-03-18 16:06:02.465100858 +0000 UTC m=+1911.334429825" Mar 18 16:06:03 crc kubenswrapper[4792]: I0318 16:06:03.453045 4792 generic.go:334] "Generic (PLEG): container finished" podID="9a53da3c-f3b8-4a0b-8ce6-c71399912c79" containerID="9901a8f054b8edc0cb76beda24d571e1f848738729ad7ee4328cf25618e6ecbe" exitCode=0 Mar 18 16:06:03 crc kubenswrapper[4792]: I0318 16:06:03.453371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-6shz9" event={"ID":"9a53da3c-f3b8-4a0b-8ce6-c71399912c79","Type":"ContainerDied","Data":"9901a8f054b8edc0cb76beda24d571e1f848738729ad7ee4328cf25618e6ecbe"} Mar 18 16:06:04 crc kubenswrapper[4792]: I0318 16:06:04.898630 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:04 crc kubenswrapper[4792]: I0318 16:06:04.983223 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htpxd\" (UniqueName: \"kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd\") pod \"9a53da3c-f3b8-4a0b-8ce6-c71399912c79\" (UID: \"9a53da3c-f3b8-4a0b-8ce6-c71399912c79\") " Mar 18 16:06:04 crc kubenswrapper[4792]: I0318 16:06:04.989182 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd" (OuterVolumeSpecName: "kube-api-access-htpxd") pod "9a53da3c-f3b8-4a0b-8ce6-c71399912c79" (UID: "9a53da3c-f3b8-4a0b-8ce6-c71399912c79"). InnerVolumeSpecName "kube-api-access-htpxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.087287 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htpxd\" (UniqueName: \"kubernetes.io/projected/9a53da3c-f3b8-4a0b-8ce6-c71399912c79-kube-api-access-htpxd\") on node \"crc\" DevicePath \"\"" Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.478004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564166-6shz9" event={"ID":"9a53da3c-f3b8-4a0b-8ce6-c71399912c79","Type":"ContainerDied","Data":"2bd2a90e9ee5e6fc52e3bb5e414eac8fcbd7f9b0318255c4df0d3b9d80a948a4"} Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.478072 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd2a90e9ee5e6fc52e3bb5e414eac8fcbd7f9b0318255c4df0d3b9d80a948a4" Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.478441 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564166-6shz9" Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.969636 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-vrzft"] Mar 18 16:06:05 crc kubenswrapper[4792]: I0318 16:06:05.981346 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564160-vrzft"] Mar 18 16:06:07 crc kubenswrapper[4792]: I0318 16:06:07.872264 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9fab6a-e73c-4238-9f8b-a37af84edd72" path="/var/lib/kubelet/pods/3a9fab6a-e73c-4238-9f8b-a37af84edd72/volumes" Mar 18 16:06:45 crc kubenswrapper[4792]: I0318 16:06:45.129404 4792 scope.go:117] "RemoveContainer" containerID="d047c24c5ec62e5910239540f5637cac1516bdfb1ea1a76d7a82c98a343a06a3" Mar 18 16:06:45 crc kubenswrapper[4792]: I0318 16:06:45.168143 4792 scope.go:117] "RemoveContainer" containerID="98ba62d9ae84759c61ea2884a57c6995a098d7af6049c24ba54da12f5d6e52ac" Mar 18 16:06:45 crc kubenswrapper[4792]: I0318 16:06:45.195166 4792 scope.go:117] "RemoveContainer" containerID="6a93449c805557502ad71f976f0fa2e7e80e09a70ca97dab75d47cb15d1ce211" Mar 18 16:06:48 crc kubenswrapper[4792]: I0318 16:06:48.060388 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4ww62"] Mar 18 16:06:48 crc kubenswrapper[4792]: I0318 16:06:48.079901 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4ww62"] Mar 18 16:06:49 crc kubenswrapper[4792]: I0318 16:06:49.873785 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c36889-6457-497c-b5bc-3c55e80356cf" path="/var/lib/kubelet/pods/09c36889-6457-497c-b5bc-3c55e80356cf/volumes" Mar 18 16:06:56 crc kubenswrapper[4792]: I0318 16:06:56.034149 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5d26-account-create-update-r7cpt"] Mar 
18 16:06:56 crc kubenswrapper[4792]: I0318 16:06:56.046270 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5d26-account-create-update-r7cpt"] Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.037654 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-e409-account-create-update-jmr8x"] Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.063121 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-e409-account-create-update-jmr8x"] Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.078213 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7skkc"] Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.090729 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7skkc"] Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.874955 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb6d4a8-112c-41fa-b6f2-41b6d7882908" path="/var/lib/kubelet/pods/3bb6d4a8-112c-41fa-b6f2-41b6d7882908/volumes" Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.876676 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d342d2e-9afe-442f-9740-ba8f58e4f2b4" path="/var/lib/kubelet/pods/4d342d2e-9afe-442f-9740-ba8f58e4f2b4/volumes" Mar 18 16:06:57 crc kubenswrapper[4792]: I0318 16:06:57.878230 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f526f38-b789-46a5-96c9-9d2c5a820e51" path="/var/lib/kubelet/pods/6f526f38-b789-46a5-96c9-9d2c5a820e51/volumes" Mar 18 16:07:02 crc kubenswrapper[4792]: I0318 16:07:02.033889 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-98a6-account-create-update-hg25j"] Mar 18 16:07:02 crc kubenswrapper[4792]: I0318 16:07:02.050454 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-98a6-account-create-update-hg25j"] Mar 18 16:07:03 crc kubenswrapper[4792]: 
I0318 16:07:03.868184 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be42760-b224-4ca3-8870-92131f90c77b" path="/var/lib/kubelet/pods/6be42760-b224-4ca3-8870-92131f90c77b/volumes" Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.055634 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ft5cr"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.066626 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-n7whx"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.080806 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-tldpp"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.089812 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-7806-account-create-update-vflmx"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.102457 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ft5cr"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.116385 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-n7whx"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.128887 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f402-account-create-update-qxl8l"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.142103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-7806-account-create-update-vflmx"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.153614 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-tldpp"] Mar 18 16:07:04 crc kubenswrapper[4792]: I0318 16:07:04.164921 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f402-account-create-update-qxl8l"] Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 
16:07:05.160612 4792 generic.go:334] "Generic (PLEG): container finished" podID="e8427835-8b71-4705-91e2-d82092ec93f5" containerID="da65379d15e79987c043376e5bde69dbd95488563a7f890e262fda97d1f83cb3" exitCode=0 Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.160721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" event={"ID":"e8427835-8b71-4705-91e2-d82092ec93f5","Type":"ContainerDied","Data":"da65379d15e79987c043376e5bde69dbd95488563a7f890e262fda97d1f83cb3"} Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.878224 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2156b56d-e393-4230-b0ae-057041cee710" path="/var/lib/kubelet/pods/2156b56d-e393-4230-b0ae-057041cee710/volumes" Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.882379 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1fbe7b-1a12-45d5-aa20-b56d3fad539f" path="/var/lib/kubelet/pods/3b1fbe7b-1a12-45d5-aa20-b56d3fad539f/volumes" Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.884368 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fe6d817-db7d-4864-9cfb-1a399587c3b9" path="/var/lib/kubelet/pods/9fe6d817-db7d-4864-9cfb-1a399587c3b9/volumes" Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.888459 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93f5d6c-7b86-47bd-952e-a1a563065c76" path="/var/lib/kubelet/pods/a93f5d6c-7b86-47bd-952e-a1a563065c76/volumes" Mar 18 16:07:05 crc kubenswrapper[4792]: I0318 16:07:05.890164 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0402169-72db-460a-a8ae-0e6f8dbd696b" path="/var/lib/kubelet/pods/c0402169-72db-460a-a8ae-0e6f8dbd696b/volumes" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.670477 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.771820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dhl\" (UniqueName: \"kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl\") pod \"e8427835-8b71-4705-91e2-d82092ec93f5\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.772140 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle\") pod \"e8427835-8b71-4705-91e2-d82092ec93f5\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.772469 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam\") pod \"e8427835-8b71-4705-91e2-d82092ec93f5\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.772656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory\") pod \"e8427835-8b71-4705-91e2-d82092ec93f5\" (UID: \"e8427835-8b71-4705-91e2-d82092ec93f5\") " Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.778253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e8427835-8b71-4705-91e2-d82092ec93f5" (UID: "e8427835-8b71-4705-91e2-d82092ec93f5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.778491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl" (OuterVolumeSpecName: "kube-api-access-24dhl") pod "e8427835-8b71-4705-91e2-d82092ec93f5" (UID: "e8427835-8b71-4705-91e2-d82092ec93f5"). InnerVolumeSpecName "kube-api-access-24dhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.814382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory" (OuterVolumeSpecName: "inventory") pod "e8427835-8b71-4705-91e2-d82092ec93f5" (UID: "e8427835-8b71-4705-91e2-d82092ec93f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.815047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e8427835-8b71-4705-91e2-d82092ec93f5" (UID: "e8427835-8b71-4705-91e2-d82092ec93f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.880330 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.880397 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.880415 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8427835-8b71-4705-91e2-d82092ec93f5-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:06 crc kubenswrapper[4792]: I0318 16:07:06.880427 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24dhl\" (UniqueName: \"kubernetes.io/projected/e8427835-8b71-4705-91e2-d82092ec93f5-kube-api-access-24dhl\") on node \"crc\" DevicePath \"\"" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.195332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" event={"ID":"e8427835-8b71-4705-91e2-d82092ec93f5","Type":"ContainerDied","Data":"5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30"} Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.195674 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5932d0269dc958873772095a842a6564a6a9ac7b4bd9d538d8b3b0e03ef94a30" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.195765 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.291205 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm"] Mar 18 16:07:07 crc kubenswrapper[4792]: E0318 16:07:07.291874 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a53da3c-f3b8-4a0b-8ce6-c71399912c79" containerName="oc" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.291897 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a53da3c-f3b8-4a0b-8ce6-c71399912c79" containerName="oc" Mar 18 16:07:07 crc kubenswrapper[4792]: E0318 16:07:07.291917 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8427835-8b71-4705-91e2-d82092ec93f5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.291929 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8427835-8b71-4705-91e2-d82092ec93f5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.292255 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a53da3c-f3b8-4a0b-8ce6-c71399912c79" containerName="oc" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.292296 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8427835-8b71-4705-91e2-d82092ec93f5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.293214 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.295498 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.296139 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.296984 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.297635 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.322526 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm"] Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.407152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.407290 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hvg\" (UniqueName: \"kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 
16:07:07.407396 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.509748 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.509811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hvg\" (UniqueName: \"kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.509851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.514841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.520595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.531353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hvg\" (UniqueName: \"kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:07 crc kubenswrapper[4792]: I0318 16:07:07.616617 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:07:08 crc kubenswrapper[4792]: I0318 16:07:08.248907 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm"] Mar 18 16:07:09 crc kubenswrapper[4792]: I0318 16:07:09.218406 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" event={"ID":"39d160c7-decc-4473-9c4f-f1282a927485","Type":"ContainerStarted","Data":"0cb13e74f75ca807a69ef3afecd724a59ee13cbc8bfe75ba98e1ba9f571f44f9"} Mar 18 16:07:10 crc kubenswrapper[4792]: I0318 16:07:10.229825 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" event={"ID":"39d160c7-decc-4473-9c4f-f1282a927485","Type":"ContainerStarted","Data":"9b0176a2981f001922fd253d5f6006a7db05616e9dd5451132b9076d10643b59"} Mar 18 16:07:10 crc kubenswrapper[4792]: I0318 16:07:10.251677 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" podStartSLOduration=2.496013278 podStartE2EDuration="3.251656425s" podCreationTimestamp="2026-03-18 16:07:07 +0000 UTC" firstStartedPulling="2026-03-18 16:07:08.231143809 +0000 UTC m=+1977.100472746" lastFinishedPulling="2026-03-18 16:07:08.986786956 +0000 UTC m=+1977.856115893" observedRunningTime="2026-03-18 16:07:10.244207711 +0000 UTC m=+1979.113536658" watchObservedRunningTime="2026-03-18 16:07:10.251656425 +0000 UTC m=+1979.120985372" Mar 18 16:07:30 crc kubenswrapper[4792]: I0318 16:07:30.322071 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:07:30 
crc kubenswrapper[4792]: I0318 16:07:30.322601 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.049258 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-da5b-account-create-update-t7nwf"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.065251 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-9t8k4"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.079813 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rrksn"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.094434 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sb8pv"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.108413 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-65b5-account-create-update-d89tz"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.119884 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sb8pv"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.133160 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-da5b-account-create-update-t7nwf"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.147603 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-9t8k4"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.161752 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-65b5-account-create-update-d89tz"] Mar 18 16:07:40 crc kubenswrapper[4792]: I0318 16:07:40.172317 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-rrksn"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.061786 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-644b-account-create-update-slgb4"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.077539 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5wbx7"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.088731 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-644b-account-create-update-slgb4"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.098376 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b0df-account-create-update-2r2xz"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.110606 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b0df-account-create-update-2r2xz"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.120309 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5wbx7"] Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.872779 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124caafa-8fb5-40be-b0bb-233a7848176f" path="/var/lib/kubelet/pods/124caafa-8fb5-40be-b0bb-233a7848176f/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.874829 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162c4e7b-9b94-4363-aa4e-25cbb6cce669" path="/var/lib/kubelet/pods/162c4e7b-9b94-4363-aa4e-25cbb6cce669/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.877626 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea659dd-3f5f-4942-9c6c-ad15ec82bd58" path="/var/lib/kubelet/pods/2ea659dd-3f5f-4942-9c6c-ad15ec82bd58/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.882374 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3282b7a4-a673-4f26-9395-3fbcfe76fea4" 
path="/var/lib/kubelet/pods/3282b7a4-a673-4f26-9395-3fbcfe76fea4/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.883127 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6755e276-5f4a-45db-850e-97ff887e55ae" path="/var/lib/kubelet/pods/6755e276-5f4a-45db-850e-97ff887e55ae/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.884708 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46fc90c-8795-46d0-b6b4-e386c126ff37" path="/var/lib/kubelet/pods/d46fc90c-8795-46d0-b6b4-e386c126ff37/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.886274 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddad064f-85b2-4334-9b1f-af2e8037a328" path="/var/lib/kubelet/pods/ddad064f-85b2-4334-9b1f-af2e8037a328/volumes" Mar 18 16:07:41 crc kubenswrapper[4792]: I0318 16:07:41.887474 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f14da68d-4090-4890-ba72-195a943a722b" path="/var/lib/kubelet/pods/f14da68d-4090-4890-ba72-195a943a722b/volumes" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.318221 4792 scope.go:117] "RemoveContainer" containerID="26b8254bf32c9f070adef10c58cc4b1b07e1b20350731bbbf7b72ab9160d432f" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.349928 4792 scope.go:117] "RemoveContainer" containerID="63760b16c1843464a4d0e076f15cd2ffbf94c45592f7020559517429c3016f13" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.406266 4792 scope.go:117] "RemoveContainer" containerID="8459a1a5f8dec12d555191105872ff4ec2ee9b3fd2d809b4dddb67fe98be2914" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.464290 4792 scope.go:117] "RemoveContainer" containerID="3ebe52b0c9be12f9ba65dfc6e682f55a9d94b0898f08328ae8aaf62639e8df07" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.542037 4792 scope.go:117] "RemoveContainer" containerID="664ad61c5e3740497e16c283a9ff433392cdb65c4e1a8dc0d57514d174a761d1" Mar 18 16:07:45 crc 
kubenswrapper[4792]: I0318 16:07:45.584842 4792 scope.go:117] "RemoveContainer" containerID="c486ec8b4994b185991d3be76c07402a4ffa86507c4323f1f2233014197038cb" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.655155 4792 scope.go:117] "RemoveContainer" containerID="ca2d28e25db123a3fa0db97e3652bdd71a0e3252324bd76ee9d2b126206ac73b" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.694384 4792 scope.go:117] "RemoveContainer" containerID="f16c1aa5e8f590b86442dd8e688ed4bc4f4200e4f31730326640ee2d9898930d" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.721031 4792 scope.go:117] "RemoveContainer" containerID="dc5297a5d704e6189454e7f868c627788f3f4cc640d63594e9197c9c1b062f0d" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.741506 4792 scope.go:117] "RemoveContainer" containerID="39815edeed392e180c806a188fbbabdf3f1434bf2d373688b56d7e4407264003" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.766587 4792 scope.go:117] "RemoveContainer" containerID="d4da398f7cccec42ba8c48cf52a1a5c57dfaf51bd19332afaf30cc84d2b62334" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.795396 4792 scope.go:117] "RemoveContainer" containerID="33d6e242b773390ac2be00843c0f5628fccbd0d8a9dadbdcab6df60e794e709b" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.822155 4792 scope.go:117] "RemoveContainer" containerID="511db70907d804b39c2b2c8e66c764c42732087e3dfbececfac0290967a0684d" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.843366 4792 scope.go:117] "RemoveContainer" containerID="b506889efdf1ae7205f5d60ee0985b148c37a6c0003af6830978158b814dacfc" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.870348 4792 scope.go:117] "RemoveContainer" containerID="dc0b9d721852424d88be1ae20b397b8b295c0f70a1ff28b55d95b176afb4fee5" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.907767 4792 scope.go:117] "RemoveContainer" containerID="0ab2646bb2f66bfaa423606ab3cb52b9cf7ad5033855396c58f9ecbd885b3a62" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 
16:07:45.965096 4792 scope.go:117] "RemoveContainer" containerID="017b0d5434ac40aa8e667a47bc17ea84d73b66c78bfeaa2d88307c0f7115fb8e" Mar 18 16:07:45 crc kubenswrapper[4792]: I0318 16:07:45.997501 4792 scope.go:117] "RemoveContainer" containerID="eb0e75a05b5ce80826ff0b1dcb9f596dffa9606205a0b26e894555066e9cb2d0" Mar 18 16:07:46 crc kubenswrapper[4792]: I0318 16:07:46.028835 4792 scope.go:117] "RemoveContainer" containerID="828e7cc77860e7196d1cbe036ea1978e973b72a92ba3b6d8c571f80418c6e60a" Mar 18 16:07:48 crc kubenswrapper[4792]: I0318 16:07:48.032375 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5x8d9"] Mar 18 16:07:48 crc kubenswrapper[4792]: I0318 16:07:48.045909 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5x8d9"] Mar 18 16:07:49 crc kubenswrapper[4792]: I0318 16:07:49.876280 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efa06c0-f363-43b0-ba89-96d41ff9db74" path="/var/lib/kubelet/pods/7efa06c0-f363-43b0-ba89-96d41ff9db74/volumes" Mar 18 16:07:54 crc kubenswrapper[4792]: I0318 16:07:54.035805 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9zvzr"] Mar 18 16:07:54 crc kubenswrapper[4792]: I0318 16:07:54.052945 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9zvzr"] Mar 18 16:07:55 crc kubenswrapper[4792]: I0318 16:07:55.868060 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84be4145-eccb-4329-a407-0ecb688a6b20" path="/var/lib/kubelet/pods/84be4145-eccb-4329-a407-0ecb688a6b20/volumes" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.156094 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564168-qcsc6"] Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.158269 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.160752 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.164420 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.164420 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.167227 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-qcsc6"] Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.250719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rvn\" (UniqueName: \"kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn\") pod \"auto-csr-approver-29564168-qcsc6\" (UID: \"c867cce4-ccfb-4dc8-baed-fb506ed4d909\") " pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.321674 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.321735 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:08:00 crc 
kubenswrapper[4792]: I0318 16:08:00.353588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rvn\" (UniqueName: \"kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn\") pod \"auto-csr-approver-29564168-qcsc6\" (UID: \"c867cce4-ccfb-4dc8-baed-fb506ed4d909\") " pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.373200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rvn\" (UniqueName: \"kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn\") pod \"auto-csr-approver-29564168-qcsc6\" (UID: \"c867cce4-ccfb-4dc8-baed-fb506ed4d909\") " pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.478026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:00 crc kubenswrapper[4792]: I0318 16:08:00.960577 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-qcsc6"] Mar 18 16:08:01 crc kubenswrapper[4792]: I0318 16:08:01.867397 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" event={"ID":"c867cce4-ccfb-4dc8-baed-fb506ed4d909","Type":"ContainerStarted","Data":"4cbde272ae7eb3ec010ee6b615f6dd601acfc9284e2b1e8258a4376663101729"} Mar 18 16:08:02 crc kubenswrapper[4792]: I0318 16:08:02.878095 4792 generic.go:334] "Generic (PLEG): container finished" podID="c867cce4-ccfb-4dc8-baed-fb506ed4d909" containerID="8c470ec44517b7477672daeaf399d3ec1c62b0cdd4573ec6ad924f77e13eb28a" exitCode=0 Mar 18 16:08:02 crc kubenswrapper[4792]: I0318 16:08:02.878590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" 
event={"ID":"c867cce4-ccfb-4dc8-baed-fb506ed4d909","Type":"ContainerDied","Data":"8c470ec44517b7477672daeaf399d3ec1c62b0cdd4573ec6ad924f77e13eb28a"} Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.309735 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.404436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58rvn\" (UniqueName: \"kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn\") pod \"c867cce4-ccfb-4dc8-baed-fb506ed4d909\" (UID: \"c867cce4-ccfb-4dc8-baed-fb506ed4d909\") " Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.414619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn" (OuterVolumeSpecName: "kube-api-access-58rvn") pod "c867cce4-ccfb-4dc8-baed-fb506ed4d909" (UID: "c867cce4-ccfb-4dc8-baed-fb506ed4d909"). InnerVolumeSpecName "kube-api-access-58rvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.506334 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58rvn\" (UniqueName: \"kubernetes.io/projected/c867cce4-ccfb-4dc8-baed-fb506ed4d909-kube-api-access-58rvn\") on node \"crc\" DevicePath \"\"" Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.904179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" event={"ID":"c867cce4-ccfb-4dc8-baed-fb506ed4d909","Type":"ContainerDied","Data":"4cbde272ae7eb3ec010ee6b615f6dd601acfc9284e2b1e8258a4376663101729"} Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.904213 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564168-qcsc6" Mar 18 16:08:04 crc kubenswrapper[4792]: I0318 16:08:04.904228 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cbde272ae7eb3ec010ee6b615f6dd601acfc9284e2b1e8258a4376663101729" Mar 18 16:08:05 crc kubenswrapper[4792]: I0318 16:08:05.388752 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-z5rgz"] Mar 18 16:08:05 crc kubenswrapper[4792]: I0318 16:08:05.403306 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564162-z5rgz"] Mar 18 16:08:05 crc kubenswrapper[4792]: I0318 16:08:05.869035 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5954f4a-541b-4146-89bd-eda39e5a9664" path="/var/lib/kubelet/pods/a5954f4a-541b-4146-89bd-eda39e5a9664/volumes" Mar 18 16:08:08 crc kubenswrapper[4792]: I0318 16:08:08.029212 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mqtz6"] Mar 18 16:08:08 crc kubenswrapper[4792]: I0318 16:08:08.039882 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mqtz6"] Mar 18 16:08:09 crc kubenswrapper[4792]: I0318 16:08:09.871828 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4c925f-3e43-4351-9703-7fdd44a1a9d6" path="/var/lib/kubelet/pods/bc4c925f-3e43-4351-9703-7fdd44a1a9d6/volumes" Mar 18 16:08:20 crc kubenswrapper[4792]: I0318 16:08:20.042995 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tjtqm"] Mar 18 16:08:20 crc kubenswrapper[4792]: I0318 16:08:20.057339 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tjtqm"] Mar 18 16:08:21 crc kubenswrapper[4792]: I0318 16:08:21.867469 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5416678-c001-4966-a019-56f29f29adc7" 
path="/var/lib/kubelet/pods/e5416678-c001-4966-a019-56f29f29adc7/volumes" Mar 18 16:08:30 crc kubenswrapper[4792]: I0318 16:08:30.321816 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:08:30 crc kubenswrapper[4792]: I0318 16:08:30.322457 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:08:30 crc kubenswrapper[4792]: I0318 16:08:30.322515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:08:30 crc kubenswrapper[4792]: I0318 16:08:30.323589 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:08:30 crc kubenswrapper[4792]: I0318 16:08:30.323654 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6" gracePeriod=600 Mar 18 16:08:31 crc kubenswrapper[4792]: I0318 16:08:31.191878 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6" exitCode=0 Mar 18 16:08:31 crc kubenswrapper[4792]: I0318 16:08:31.191937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6"} Mar 18 16:08:31 crc kubenswrapper[4792]: I0318 16:08:31.192451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"} Mar 18 16:08:31 crc kubenswrapper[4792]: I0318 16:08:31.192472 4792 scope.go:117] "RemoveContainer" containerID="765fb60a99637aecdb451ae0902a9b780ef8338acc954ec4a66546467003467e" Mar 18 16:08:33 crc kubenswrapper[4792]: I0318 16:08:33.036427 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jjwfp"] Mar 18 16:08:33 crc kubenswrapper[4792]: I0318 16:08:33.053169 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jjwfp"] Mar 18 16:08:33 crc kubenswrapper[4792]: I0318 16:08:33.869364 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a148250-156e-4b40-969e-3569cea8a403" path="/var/lib/kubelet/pods/6a148250-156e-4b40-969e-3569cea8a403/volumes" Mar 18 16:08:34 crc kubenswrapper[4792]: I0318 16:08:34.031445 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nzh9d"] Mar 18 16:08:34 crc kubenswrapper[4792]: I0318 16:08:34.046199 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nzh9d"] Mar 18 16:08:35 crc kubenswrapper[4792]: I0318 16:08:35.870487 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="46820dcb-64bf-40d0-ba36-827e2937de58" path="/var/lib/kubelet/pods/46820dcb-64bf-40d0-ba36-827e2937de58/volumes" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.060184 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k2l2s"] Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.075519 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k2l2s"] Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.089675 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ncpfm"] Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.100552 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ncpfm"] Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.450160 4792 scope.go:117] "RemoveContainer" containerID="2fa8370ff2792528e01c0f49ffb6411dafef9e5ac1bf12e7f35dab1dd3e59640" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.511670 4792 scope.go:117] "RemoveContainer" containerID="59823f71f56a681e7eaa4b6e106c5069d39fdf7e592ab47bc1d91f1ef0778ed3" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.573137 4792 scope.go:117] "RemoveContainer" containerID="99d8b233280187e8ef8e4cf628af40a883b0a835e14403c0be363cbfb929da57" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.618319 4792 scope.go:117] "RemoveContainer" containerID="b876576145bb9ed1da5a5558aa758f8c55f817ca489ffd15940130cbf4a3a229" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.664545 4792 scope.go:117] "RemoveContainer" containerID="5524ac9f88d877ff2d5e17e2864d7fa1f95f89b4b6b117790ee5a210a64d8cc5" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.740280 4792 scope.go:117] "RemoveContainer" containerID="1babca15697a2cd9b118a6bc1af2ffccc9c677253ff1134da2c62b4b40f3eea3" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.817616 4792 scope.go:117] "RemoveContainer" 
containerID="4ebea08edaca8a4a6d08983b6f3f825649307662de043611af82327a1934832e" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.844422 4792 scope.go:117] "RemoveContainer" containerID="938cb6b02e3b47a9aba8104a14d29e0e6a3726f4126e5dd07b212afdf79a9deb" Mar 18 16:08:46 crc kubenswrapper[4792]: I0318 16:08:46.899434 4792 scope.go:117] "RemoveContainer" containerID="c9f0870267e670b6ce6bdd76570bef0e437e36b8cc0b28bec08ec5e5531395a3" Mar 18 16:08:47 crc kubenswrapper[4792]: I0318 16:08:47.870558 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4cf6d4-998f-445b-82ed-25b2b4670875" path="/var/lib/kubelet/pods/2f4cf6d4-998f-445b-82ed-25b2b4670875/volumes" Mar 18 16:08:47 crc kubenswrapper[4792]: I0318 16:08:47.872494 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6967171c-e427-4723-ae1e-25e3bad61d59" path="/var/lib/kubelet/pods/6967171c-e427-4723-ae1e-25e3bad61d59/volumes" Mar 18 16:09:01 crc kubenswrapper[4792]: I0318 16:09:01.526522 4792 generic.go:334] "Generic (PLEG): container finished" podID="39d160c7-decc-4473-9c4f-f1282a927485" containerID="9b0176a2981f001922fd253d5f6006a7db05616e9dd5451132b9076d10643b59" exitCode=0 Mar 18 16:09:01 crc kubenswrapper[4792]: I0318 16:09:01.526644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" event={"ID":"39d160c7-decc-4473-9c4f-f1282a927485","Type":"ContainerDied","Data":"9b0176a2981f001922fd253d5f6006a7db05616e9dd5451132b9076d10643b59"} Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.097767 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.200764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory\") pod \"39d160c7-decc-4473-9c4f-f1282a927485\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.201090 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam\") pod \"39d160c7-decc-4473-9c4f-f1282a927485\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.201299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7hvg\" (UniqueName: \"kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg\") pod \"39d160c7-decc-4473-9c4f-f1282a927485\" (UID: \"39d160c7-decc-4473-9c4f-f1282a927485\") " Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.213829 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg" (OuterVolumeSpecName: "kube-api-access-z7hvg") pod "39d160c7-decc-4473-9c4f-f1282a927485" (UID: "39d160c7-decc-4473-9c4f-f1282a927485"). InnerVolumeSpecName "kube-api-access-z7hvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.234263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory" (OuterVolumeSpecName: "inventory") pod "39d160c7-decc-4473-9c4f-f1282a927485" (UID: "39d160c7-decc-4473-9c4f-f1282a927485"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.246537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39d160c7-decc-4473-9c4f-f1282a927485" (UID: "39d160c7-decc-4473-9c4f-f1282a927485"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.304939 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hvg\" (UniqueName: \"kubernetes.io/projected/39d160c7-decc-4473-9c4f-f1282a927485-kube-api-access-z7hvg\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.304985 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.304999 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39d160c7-decc-4473-9c4f-f1282a927485-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.552042 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" event={"ID":"39d160c7-decc-4473-9c4f-f1282a927485","Type":"ContainerDied","Data":"0cb13e74f75ca807a69ef3afecd724a59ee13cbc8bfe75ba98e1ba9f571f44f9"} Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.552093 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb13e74f75ca807a69ef3afecd724a59ee13cbc8bfe75ba98e1ba9f571f44f9" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 
16:09:03.552267 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.640308 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h"] Mar 18 16:09:03 crc kubenswrapper[4792]: E0318 16:09:03.641198 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c867cce4-ccfb-4dc8-baed-fb506ed4d909" containerName="oc" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.641290 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c867cce4-ccfb-4dc8-baed-fb506ed4d909" containerName="oc" Mar 18 16:09:03 crc kubenswrapper[4792]: E0318 16:09:03.641387 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d160c7-decc-4473-9c4f-f1282a927485" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.641459 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d160c7-decc-4473-9c4f-f1282a927485" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.641821 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c867cce4-ccfb-4dc8-baed-fb506ed4d909" containerName="oc" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.641933 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d160c7-decc-4473-9c4f-f1282a927485" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.643264 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.645763 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.646004 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.646191 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.646706 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.653831 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h"] Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.719639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.719739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc 
kubenswrapper[4792]: I0318 16:09:03.719927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44xp\" (UniqueName: \"kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.822274 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.822339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.822436 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44xp\" (UniqueName: \"kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.857245 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w44xp\" (UniqueName: \"kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.862499 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.863028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t584h\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:03 crc kubenswrapper[4792]: I0318 16:09:03.983799 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:09:04 crc kubenswrapper[4792]: I0318 16:09:04.564477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h"] Mar 18 16:09:05 crc kubenswrapper[4792]: I0318 16:09:05.575405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" event={"ID":"e5814716-d18e-49c1-8543-b99e741df9d9","Type":"ContainerStarted","Data":"ebd5ef402dbb9782345f62efd94e2025882f5634a68329a163214bca9e4472a4"} Mar 18 16:09:05 crc kubenswrapper[4792]: I0318 16:09:05.576066 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" event={"ID":"e5814716-d18e-49c1-8543-b99e741df9d9","Type":"ContainerStarted","Data":"06ab449165afa79ae70f6a48a3805cc02f7cbfae0f1f5ab3cb072f1806ad18e3"} Mar 18 16:09:05 crc kubenswrapper[4792]: I0318 16:09:05.615280 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" podStartSLOduration=2.182445318 podStartE2EDuration="2.615236388s" podCreationTimestamp="2026-03-18 16:09:03 +0000 UTC" firstStartedPulling="2026-03-18 16:09:04.555442634 +0000 UTC m=+2093.424771571" lastFinishedPulling="2026-03-18 16:09:04.988233704 +0000 UTC m=+2093.857562641" observedRunningTime="2026-03-18 16:09:05.594101583 +0000 UTC m=+2094.463430540" watchObservedRunningTime="2026-03-18 16:09:05.615236388 +0000 UTC m=+2094.484565315" Mar 18 16:09:33 crc kubenswrapper[4792]: I0318 16:09:33.045650 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-76vww"] Mar 18 16:09:33 crc kubenswrapper[4792]: I0318 16:09:33.056890 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-76vww"] Mar 18 16:09:33 crc 
kubenswrapper[4792]: I0318 16:09:33.867464 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971cd0c7-f215-44c4-bb0f-8930af5c49c5" path="/var/lib/kubelet/pods/971cd0c7-f215-44c4-bb0f-8930af5c49c5/volumes" Mar 18 16:09:34 crc kubenswrapper[4792]: I0318 16:09:34.053038 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3c34-account-create-update-r5zbr"] Mar 18 16:09:34 crc kubenswrapper[4792]: I0318 16:09:34.064886 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3c34-account-create-update-r5zbr"] Mar 18 16:09:34 crc kubenswrapper[4792]: I0318 16:09:34.076633 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e36b-account-create-update-mkc5t"] Mar 18 16:09:34 crc kubenswrapper[4792]: I0318 16:09:34.087550 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e36b-account-create-update-mkc5t"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.037342 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mgrlp"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.048802 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88f3-account-create-update-2s8l9"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.059270 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mgrlp"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.071926 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7llzh"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.082123 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88f3-account-create-update-2s8l9"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.091596 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7llzh"] Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 
16:09:35.869206 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f56bb52-48fd-427b-9524-2074c22df4b0" path="/var/lib/kubelet/pods/1f56bb52-48fd-427b-9524-2074c22df4b0/volumes" Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.871070 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292e5bc9-bca8-481a-99a8-512e17be912f" path="/var/lib/kubelet/pods/292e5bc9-bca8-481a-99a8-512e17be912f/volumes" Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.873574 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f349af-14ee-488d-ab71-3be43d3950ce" path="/var/lib/kubelet/pods/31f349af-14ee-488d-ab71-3be43d3950ce/volumes" Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.874441 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39938593-0553-4883-976f-d412c79c5357" path="/var/lib/kubelet/pods/39938593-0553-4883-976f-d412c79c5357/volumes" Mar 18 16:09:35 crc kubenswrapper[4792]: I0318 16:09:35.876177 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb2ab4d-b4cd-4ada-9d14-70d845630eba" path="/var/lib/kubelet/pods/cfb2ab4d-b4cd-4ada-9d14-70d845630eba/volumes" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.243268 4792 scope.go:117] "RemoveContainer" containerID="34adbb44391dcc67203b790c1e5370e397aebdbbeda2498d2118d425de02ed02" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.277791 4792 scope.go:117] "RemoveContainer" containerID="af40697eace1857ccb6cd8dbdb5ab5a314d44de4eab943998d393beda405dfd8" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.378828 4792 scope.go:117] "RemoveContainer" containerID="7cb3bd3b1a557c965a86d2d607d301f23df95f4ecc58fadafc0fe7bceacbf9e6" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.439091 4792 scope.go:117] "RemoveContainer" containerID="cbfd6fd049430a5385b4ef7fbdd7f9e73fe1a3da8ce0127d46b938f01729d8e8" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.508242 4792 
scope.go:117] "RemoveContainer" containerID="c7c8c1f8ff9a63368d44b46282339ef63381908aeb1144e538578474823bc61e" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.561085 4792 scope.go:117] "RemoveContainer" containerID="c295e5f09de2193dca4d026669e6ca060797e588a59a67ab1332a95d0a7f8b47" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.618875 4792 scope.go:117] "RemoveContainer" containerID="38d4f0b33dd4c7c042ecfcb2c172e3ec32a5ca02551c744e047d927ce6862d7f" Mar 18 16:09:47 crc kubenswrapper[4792]: I0318 16:09:47.642624 4792 scope.go:117] "RemoveContainer" containerID="379da200892de03b2fa0a80240e5ab28e591cd78074fd5ef469c363ae8e0e4ce" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.008840 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.012505 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.020314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.080401 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.080632 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzpx\" (UniqueName: \"kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " 
pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.080729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.182929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.183094 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzpx\" (UniqueName: \"kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.183139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.183880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " 
pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.184005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.211818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzpx\" (UniqueName: \"kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx\") pod \"redhat-operators-4t4j4\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.340377 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:09:51 crc kubenswrapper[4792]: W0318 16:09:51.829521 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5 WatchSource:0}: Error finding container 498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5: Status 404 returned error can't find the container with id 498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5 Mar 18 16:09:51 crc kubenswrapper[4792]: I0318 16:09:51.829719 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:09:52 crc kubenswrapper[4792]: I0318 16:09:52.215083 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerID="0c7680dc29e7ad6fc88180e236055e23537c722aa649d820e62cda547e1199ec" exitCode=0 Mar 18 16:09:52 crc 
kubenswrapper[4792]: I0318 16:09:52.215299 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerDied","Data":"0c7680dc29e7ad6fc88180e236055e23537c722aa649d820e62cda547e1199ec"} Mar 18 16:09:52 crc kubenswrapper[4792]: I0318 16:09:52.215405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerStarted","Data":"498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5"} Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.153649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564170-2kb6g"] Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.156739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.161293 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.161364 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.161757 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.165803 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-2kb6g"] Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.278764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v899\" (UniqueName: \"kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899\") pod 
\"auto-csr-approver-29564170-2kb6g\" (UID: \"22cbddd2-7778-4435-983d-cec16396df53\") " pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.381987 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v899\" (UniqueName: \"kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899\") pod \"auto-csr-approver-29564170-2kb6g\" (UID: \"22cbddd2-7778-4435-983d-cec16396df53\") " pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.419877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v899\" (UniqueName: \"kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899\") pod \"auto-csr-approver-29564170-2kb6g\" (UID: \"22cbddd2-7778-4435-983d-cec16396df53\") " pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.494237 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:00 crc kubenswrapper[4792]: I0318 16:10:00.982430 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-2kb6g"] Mar 18 16:10:00 crc kubenswrapper[4792]: W0318 16:10:00.992505 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22cbddd2_7778_4435_983d_cec16396df53.slice/crio-0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b WatchSource:0}: Error finding container 0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b: Status 404 returned error can't find the container with id 0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b Mar 18 16:10:01 crc kubenswrapper[4792]: I0318 16:10:01.030606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" event={"ID":"22cbddd2-7778-4435-983d-cec16396df53","Type":"ContainerStarted","Data":"0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b"} Mar 18 16:10:11 crc kubenswrapper[4792]: I0318 16:10:11.195590 4792 generic.go:334] "Generic (PLEG): container finished" podID="22cbddd2-7778-4435-983d-cec16396df53" containerID="7b531196521034eb6e3534d78385fd50ca336da9e96d216ddbac3e53adca1991" exitCode=0 Mar 18 16:10:11 crc kubenswrapper[4792]: I0318 16:10:11.196287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" event={"ID":"22cbddd2-7778-4435-983d-cec16396df53","Type":"ContainerDied","Data":"7b531196521034eb6e3534d78385fd50ca336da9e96d216ddbac3e53adca1991"} Mar 18 16:10:11 crc kubenswrapper[4792]: I0318 16:10:11.200505 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" 
event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerStarted","Data":"cf62489ac4be7e3a63f115eac4e25263b7b85c71892d0b1bf0c757d655691ca4"} Mar 18 16:10:12 crc kubenswrapper[4792]: I0318 16:10:12.667402 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:12 crc kubenswrapper[4792]: I0318 16:10:12.854079 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v899\" (UniqueName: \"kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899\") pod \"22cbddd2-7778-4435-983d-cec16396df53\" (UID: \"22cbddd2-7778-4435-983d-cec16396df53\") " Mar 18 16:10:12 crc kubenswrapper[4792]: I0318 16:10:12.995996 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899" (OuterVolumeSpecName: "kube-api-access-9v899") pod "22cbddd2-7778-4435-983d-cec16396df53" (UID: "22cbddd2-7778-4435-983d-cec16396df53"). InnerVolumeSpecName "kube-api-access-9v899". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.060312 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v899\" (UniqueName: \"kubernetes.io/projected/22cbddd2-7778-4435-983d-cec16396df53-kube-api-access-9v899\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.226224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" event={"ID":"22cbddd2-7778-4435-983d-cec16396df53","Type":"ContainerDied","Data":"0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b"} Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.226265 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec04b9aa42194694d36662b48abc1481b1524aa9c5e8beea37078cd873e2f0b" Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.226284 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564170-2kb6g" Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.736385 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-84j7r"] Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.747464 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564164-84j7r"] Mar 18 16:10:13 crc kubenswrapper[4792]: I0318 16:10:13.876657 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67989f84-d392-47e7-8358-37830b7dcaee" path="/var/lib/kubelet/pods/67989f84-d392-47e7-8358-37830b7dcaee/volumes" Mar 18 16:10:14 crc kubenswrapper[4792]: I0318 16:10:14.263359 4792 generic.go:334] "Generic (PLEG): container finished" podID="e5814716-d18e-49c1-8543-b99e741df9d9" containerID="ebd5ef402dbb9782345f62efd94e2025882f5634a68329a163214bca9e4472a4" exitCode=0 Mar 18 16:10:14 crc kubenswrapper[4792]: I0318 16:10:14.263782 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" event={"ID":"e5814716-d18e-49c1-8543-b99e741df9d9","Type":"ContainerDied","Data":"ebd5ef402dbb9782345f62efd94e2025882f5634a68329a163214bca9e4472a4"} Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.277100 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerID="cf62489ac4be7e3a63f115eac4e25263b7b85c71892d0b1bf0c757d655691ca4" exitCode=0 Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.277200 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerDied","Data":"cf62489ac4be7e3a63f115eac4e25263b7b85c71892d0b1bf0c757d655691ca4"} Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.529343 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:15 crc kubenswrapper[4792]: E0318 16:10:15.530257 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cbddd2-7778-4435-983d-cec16396df53" containerName="oc" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.530282 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cbddd2-7778-4435-983d-cec16396df53" containerName="oc" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.530624 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cbddd2-7778-4435-983d-cec16396df53" containerName="oc" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.532819 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.549691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.632631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.632718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.633321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftgc\" (UniqueName: \"kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.738988 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.739450 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.739487 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftgc\" (UniqueName: \"kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.739598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.740075 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.774101 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftgc\" (UniqueName: \"kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc\") pod \"certified-operators-6tj96\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:15 crc kubenswrapper[4792]: I0318 16:10:15.859543 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.024564 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.155180 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam\") pod \"e5814716-d18e-49c1-8543-b99e741df9d9\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.155793 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory\") pod \"e5814716-d18e-49c1-8543-b99e741df9d9\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.156362 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44xp\" (UniqueName: \"kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp\") pod \"e5814716-d18e-49c1-8543-b99e741df9d9\" (UID: \"e5814716-d18e-49c1-8543-b99e741df9d9\") " Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.172503 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp" (OuterVolumeSpecName: "kube-api-access-w44xp") pod "e5814716-d18e-49c1-8543-b99e741df9d9" (UID: "e5814716-d18e-49c1-8543-b99e741df9d9"). InnerVolumeSpecName "kube-api-access-w44xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.203370 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e5814716-d18e-49c1-8543-b99e741df9d9" (UID: "e5814716-d18e-49c1-8543-b99e741df9d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.220735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory" (OuterVolumeSpecName: "inventory") pod "e5814716-d18e-49c1-8543-b99e741df9d9" (UID: "e5814716-d18e-49c1-8543-b99e741df9d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.262018 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w44xp\" (UniqueName: \"kubernetes.io/projected/e5814716-d18e-49c1-8543-b99e741df9d9-kube-api-access-w44xp\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.262060 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.262074 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5814716-d18e-49c1-8543-b99e741df9d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.290944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" 
event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerStarted","Data":"d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d"} Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.293567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" event={"ID":"e5814716-d18e-49c1-8543-b99e741df9d9","Type":"ContainerDied","Data":"06ab449165afa79ae70f6a48a3805cc02f7cbfae0f1f5ab3cb072f1806ad18e3"} Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.293703 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06ab449165afa79ae70f6a48a3805cc02f7cbfae0f1f5ab3cb072f1806ad18e3" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.293604 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t584h" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.314144 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4t4j4" podStartSLOduration=2.691017325 podStartE2EDuration="26.314121748s" podCreationTimestamp="2026-03-18 16:09:50 +0000 UTC" firstStartedPulling="2026-03-18 16:09:52.216954503 +0000 UTC m=+2141.086283440" lastFinishedPulling="2026-03-18 16:10:15.840058926 +0000 UTC m=+2164.709387863" observedRunningTime="2026-03-18 16:10:16.313750757 +0000 UTC m=+2165.183079704" watchObservedRunningTime="2026-03-18 16:10:16.314121748 +0000 UTC m=+2165.183450685" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.392450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph"] Mar 18 16:10:16 crc kubenswrapper[4792]: E0318 16:10:16.393098 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5814716-d18e-49c1-8543-b99e741df9d9" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.393126 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5814716-d18e-49c1-8543-b99e741df9d9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.393438 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5814716-d18e-49c1-8543-b99e741df9d9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.394509 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.402518 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.402589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.402776 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.402944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.405158 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph"] Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.468208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:16 crc kubenswrapper[4792]: W0318 16:10:16.479866 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32602e14_8ee6_43a9_bc8b_45c26933b29e.slice/crio-293914060a541f57484c36c050d859e5003d2b10be612ba6476e85ce7325925d WatchSource:0}: Error finding container 293914060a541f57484c36c050d859e5003d2b10be612ba6476e85ce7325925d: Status 404 returned error can't find the container with id 293914060a541f57484c36c050d859e5003d2b10be612ba6476e85ce7325925d Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.572662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9pcp\" (UniqueName: \"kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.573103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.573446 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.676428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.676758 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.676948 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9pcp\" (UniqueName: \"kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.682875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.682952 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: 
\"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.703852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9pcp\" (UniqueName: \"kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:16 crc kubenswrapper[4792]: I0318 16:10:16.722026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:17 crc kubenswrapper[4792]: I0318 16:10:17.305473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph"] Mar 18 16:10:17 crc kubenswrapper[4792]: I0318 16:10:17.314419 4792 generic.go:334] "Generic (PLEG): container finished" podID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerID="5c8e5f98c1ec1eb3ff88b84c2f2f0d16ba62b9f7834e189132effe0ef5626d2b" exitCode=0 Mar 18 16:10:17 crc kubenswrapper[4792]: I0318 16:10:17.314484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerDied","Data":"5c8e5f98c1ec1eb3ff88b84c2f2f0d16ba62b9f7834e189132effe0ef5626d2b"} Mar 18 16:10:17 crc kubenswrapper[4792]: I0318 16:10:17.314512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerStarted","Data":"293914060a541f57484c36c050d859e5003d2b10be612ba6476e85ce7325925d"} Mar 18 16:10:18 crc kubenswrapper[4792]: I0318 16:10:18.033226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-jjzdm"] Mar 18 16:10:18 crc kubenswrapper[4792]: I0318 16:10:18.046825 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jjzdm"] Mar 18 16:10:18 crc kubenswrapper[4792]: I0318 16:10:18.325729 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" event={"ID":"be9de932-130e-4207-bc36-6c6527e63684","Type":"ContainerStarted","Data":"968d67c52e857705665c3dd824c771a75dd426c73055ee939a1395617abf995b"} Mar 18 16:10:18 crc kubenswrapper[4792]: I0318 16:10:18.325777 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" event={"ID":"be9de932-130e-4207-bc36-6c6527e63684","Type":"ContainerStarted","Data":"d72ac8bd0199a2e11a9be6f0f433440b41d289631ac35ff643d25f920c35b91f"} Mar 18 16:10:18 crc kubenswrapper[4792]: I0318 16:10:18.347533 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" podStartSLOduration=1.921526826 podStartE2EDuration="2.347511318s" podCreationTimestamp="2026-03-18 16:10:16 +0000 UTC" firstStartedPulling="2026-03-18 16:10:17.314706619 +0000 UTC m=+2166.184035556" lastFinishedPulling="2026-03-18 16:10:17.740691111 +0000 UTC m=+2166.610020048" observedRunningTime="2026-03-18 16:10:18.33831617 +0000 UTC m=+2167.207645107" watchObservedRunningTime="2026-03-18 16:10:18.347511318 +0000 UTC m=+2167.216840255" Mar 18 16:10:19 crc kubenswrapper[4792]: I0318 16:10:19.339031 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerStarted","Data":"d635af5c10ffd5752aae40afc4466b4e78badd0aac687d22970a37a93b4aeeb9"} Mar 18 16:10:19 crc kubenswrapper[4792]: I0318 16:10:19.885769 4792 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="29554117-39ac-4a2b-bd31-4d6858fb7931" path="/var/lib/kubelet/pods/29554117-39ac-4a2b-bd31-4d6858fb7931/volumes" Mar 18 16:10:21 crc kubenswrapper[4792]: I0318 16:10:21.342295 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:10:21 crc kubenswrapper[4792]: I0318 16:10:21.342681 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:10:21 crc kubenswrapper[4792]: I0318 16:10:21.363103 4792 generic.go:334] "Generic (PLEG): container finished" podID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerID="d635af5c10ffd5752aae40afc4466b4e78badd0aac687d22970a37a93b4aeeb9" exitCode=0 Mar 18 16:10:21 crc kubenswrapper[4792]: I0318 16:10:21.363149 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerDied","Data":"d635af5c10ffd5752aae40afc4466b4e78badd0aac687d22970a37a93b4aeeb9"} Mar 18 16:10:22 crc kubenswrapper[4792]: I0318 16:10:22.386561 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerStarted","Data":"855f4634b1de0ebf8765f583690f879c7e6d41070070bff03019b2b4ae664a45"} Mar 18 16:10:22 crc kubenswrapper[4792]: I0318 16:10:22.410701 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4t4j4" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:22 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:22 crc kubenswrapper[4792]: > Mar 18 16:10:22 crc kubenswrapper[4792]: I0318 16:10:22.442152 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-6tj96" podStartSLOduration=2.739108005 podStartE2EDuration="7.442128609s" podCreationTimestamp="2026-03-18 16:10:15 +0000 UTC" firstStartedPulling="2026-03-18 16:10:17.331270081 +0000 UTC m=+2166.200599008" lastFinishedPulling="2026-03-18 16:10:22.034290675 +0000 UTC m=+2170.903619612" observedRunningTime="2026-03-18 16:10:22.433258751 +0000 UTC m=+2171.302587708" watchObservedRunningTime="2026-03-18 16:10:22.442128609 +0000 UTC m=+2171.311457546" Mar 18 16:10:23 crc kubenswrapper[4792]: I0318 16:10:23.398308 4792 generic.go:334] "Generic (PLEG): container finished" podID="be9de932-130e-4207-bc36-6c6527e63684" containerID="968d67c52e857705665c3dd824c771a75dd426c73055ee939a1395617abf995b" exitCode=0 Mar 18 16:10:23 crc kubenswrapper[4792]: I0318 16:10:23.398394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" event={"ID":"be9de932-130e-4207-bc36-6c6527e63684","Type":"ContainerDied","Data":"968d67c52e857705665c3dd824c771a75dd426c73055ee939a1395617abf995b"} Mar 18 16:10:24 crc kubenswrapper[4792]: I0318 16:10:24.914197 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.085906 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9pcp\" (UniqueName: \"kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp\") pod \"be9de932-130e-4207-bc36-6c6527e63684\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.086365 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory\") pod \"be9de932-130e-4207-bc36-6c6527e63684\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.086405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam\") pod \"be9de932-130e-4207-bc36-6c6527e63684\" (UID: \"be9de932-130e-4207-bc36-6c6527e63684\") " Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.091581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp" (OuterVolumeSpecName: "kube-api-access-d9pcp") pod "be9de932-130e-4207-bc36-6c6527e63684" (UID: "be9de932-130e-4207-bc36-6c6527e63684"). InnerVolumeSpecName "kube-api-access-d9pcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.119895 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be9de932-130e-4207-bc36-6c6527e63684" (UID: "be9de932-130e-4207-bc36-6c6527e63684"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.129941 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory" (OuterVolumeSpecName: "inventory") pod "be9de932-130e-4207-bc36-6c6527e63684" (UID: "be9de932-130e-4207-bc36-6c6527e63684"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.189370 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.189402 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9de932-130e-4207-bc36-6c6527e63684-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.189415 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9pcp\" (UniqueName: \"kubernetes.io/projected/be9de932-130e-4207-bc36-6c6527e63684-kube-api-access-d9pcp\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.417673 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" 
event={"ID":"be9de932-130e-4207-bc36-6c6527e63684","Type":"ContainerDied","Data":"d72ac8bd0199a2e11a9be6f0f433440b41d289631ac35ff643d25f920c35b91f"} Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.417719 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72ac8bd0199a2e11a9be6f0f433440b41d289631ac35ff643d25f920c35b91f" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.417726 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.581704 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw"] Mar 18 16:10:25 crc kubenswrapper[4792]: E0318 16:10:25.582336 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9de932-130e-4207-bc36-6c6527e63684" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.582355 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9de932-130e-4207-bc36-6c6527e63684" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.582837 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9de932-130e-4207-bc36-6c6527e63684" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.583953 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.586139 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.586174 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.587211 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.589852 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.596406 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw"] Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.702486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.702672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.703127 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.805141 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.805467 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.805530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.810189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.811602 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.824777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4dqtw\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.869054 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.869111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:25 crc kubenswrapper[4792]: I0318 16:10:25.908837 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:10:26 crc kubenswrapper[4792]: I0318 16:10:26.604705 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw"] Mar 18 16:10:26 crc kubenswrapper[4792]: W0318 16:10:26.605727 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea475c80_81c5_4bb6_937f_4a2a87d6d9e7.slice/crio-d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e WatchSource:0}: Error finding container d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e: Status 404 returned error can't find the container with id d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e Mar 18 16:10:26 crc kubenswrapper[4792]: I0318 16:10:26.941301 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6tj96" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:10:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:10:26 crc kubenswrapper[4792]: > Mar 18 16:10:27 crc kubenswrapper[4792]: I0318 16:10:27.440802 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" event={"ID":"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7","Type":"ContainerStarted","Data":"ccd32f17a1c6b12fe73495337b4c0df4b7d185ed9227935c6c59e39bb7eccc94"} Mar 18 16:10:27 crc kubenswrapper[4792]: I0318 16:10:27.441168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" event={"ID":"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7","Type":"ContainerStarted","Data":"d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e"} Mar 18 16:10:27 crc kubenswrapper[4792]: I0318 16:10:27.468451 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" podStartSLOduration=2.021483321 podStartE2EDuration="2.468426914s" podCreationTimestamp="2026-03-18 16:10:25 +0000 UTC" firstStartedPulling="2026-03-18 16:10:26.60857789 +0000 UTC m=+2175.477906827" lastFinishedPulling="2026-03-18 16:10:27.055521483 +0000 UTC m=+2175.924850420" observedRunningTime="2026-03-18 16:10:27.456390344 +0000 UTC m=+2176.325719291" watchObservedRunningTime="2026-03-18 16:10:27.468426914 +0000 UTC m=+2176.337755851" Mar 18 16:10:30 crc kubenswrapper[4792]: I0318 16:10:30.321739 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:10:30 crc kubenswrapper[4792]: I0318 16:10:30.322338 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:10:31 crc kubenswrapper[4792]: I0318 16:10:31.395851 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:10:31 crc kubenswrapper[4792]: I0318 16:10:31.450789 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:10:31 crc kubenswrapper[4792]: I0318 16:10:31.541591 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:10:31 crc kubenswrapper[4792]: I0318 16:10:31.649296 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 16:10:31 crc kubenswrapper[4792]: I0318 16:10:31.649574 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sxbbw" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="registry-server" containerID="cri-o://0077bb29acb92306864d5545a151040ccb7524ded67ca47dc7a3f12787988ba5" gracePeriod=2 Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.554204 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sxbbw_283429bb-089d-4683-a2a9-581e26af8a6a/registry-server/0.log" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.556315 4792 generic.go:334] "Generic (PLEG): container finished" podID="283429bb-089d-4683-a2a9-581e26af8a6a" containerID="0077bb29acb92306864d5545a151040ccb7524ded67ca47dc7a3f12787988ba5" exitCode=137 Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.556355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerDied","Data":"0077bb29acb92306864d5545a151040ccb7524ded67ca47dc7a3f12787988ba5"} Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.556384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxbbw" event={"ID":"283429bb-089d-4683-a2a9-581e26af8a6a","Type":"ContainerDied","Data":"b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e"} Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.556399 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b44dc2909c75d4e3beb4936545624074d5a6152efbc11f62182df544bc404c5e" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.632798 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sxbbw_283429bb-089d-4683-a2a9-581e26af8a6a/registry-server/0.log" 
Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.634092 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.746176 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content\") pod \"283429bb-089d-4683-a2a9-581e26af8a6a\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.746294 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities\") pod \"283429bb-089d-4683-a2a9-581e26af8a6a\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.746422 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m9tt\" (UniqueName: \"kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt\") pod \"283429bb-089d-4683-a2a9-581e26af8a6a\" (UID: \"283429bb-089d-4683-a2a9-581e26af8a6a\") " Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.749200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities" (OuterVolumeSpecName: "utilities") pod "283429bb-089d-4683-a2a9-581e26af8a6a" (UID: "283429bb-089d-4683-a2a9-581e26af8a6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.759700 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt" (OuterVolumeSpecName: "kube-api-access-2m9tt") pod "283429bb-089d-4683-a2a9-581e26af8a6a" (UID: "283429bb-089d-4683-a2a9-581e26af8a6a"). InnerVolumeSpecName "kube-api-access-2m9tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.851148 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.851188 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m9tt\" (UniqueName: \"kubernetes.io/projected/283429bb-089d-4683-a2a9-581e26af8a6a-kube-api-access-2m9tt\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.920800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "283429bb-089d-4683-a2a9-581e26af8a6a" (UID: "283429bb-089d-4683-a2a9-581e26af8a6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:34 crc kubenswrapper[4792]: I0318 16:10:34.953563 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/283429bb-089d-4683-a2a9-581e26af8a6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.567055 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxbbw" Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.615463 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.635674 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sxbbw"] Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.892441 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" path="/var/lib/kubelet/pods/283429bb-089d-4683-a2a9-581e26af8a6a/volumes" Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.937614 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:35 crc kubenswrapper[4792]: I0318 16:10:35.998801 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:37 crc kubenswrapper[4792]: I0318 16:10:37.845881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:37 crc kubenswrapper[4792]: I0318 16:10:37.846418 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tj96" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="registry-server" containerID="cri-o://855f4634b1de0ebf8765f583690f879c7e6d41070070bff03019b2b4ae664a45" gracePeriod=2 Mar 18 16:10:38 crc kubenswrapper[4792]: I0318 16:10:38.598941 4792 generic.go:334] "Generic (PLEG): container finished" podID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerID="855f4634b1de0ebf8765f583690f879c7e6d41070070bff03019b2b4ae664a45" exitCode=0 Mar 18 16:10:38 crc kubenswrapper[4792]: I0318 16:10:38.599016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerDied","Data":"855f4634b1de0ebf8765f583690f879c7e6d41070070bff03019b2b4ae664a45"} Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.024230 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.195004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content\") pod \"32602e14-8ee6-43a9-bc8b-45c26933b29e\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.195566 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities\") pod \"32602e14-8ee6-43a9-bc8b-45c26933b29e\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.195850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tftgc\" (UniqueName: \"kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc\") pod \"32602e14-8ee6-43a9-bc8b-45c26933b29e\" (UID: \"32602e14-8ee6-43a9-bc8b-45c26933b29e\") " Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.196401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities" (OuterVolumeSpecName: "utilities") pod "32602e14-8ee6-43a9-bc8b-45c26933b29e" (UID: "32602e14-8ee6-43a9-bc8b-45c26933b29e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.197076 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.204436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc" (OuterVolumeSpecName: "kube-api-access-tftgc") pod "32602e14-8ee6-43a9-bc8b-45c26933b29e" (UID: "32602e14-8ee6-43a9-bc8b-45c26933b29e"). InnerVolumeSpecName "kube-api-access-tftgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.252799 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32602e14-8ee6-43a9-bc8b-45c26933b29e" (UID: "32602e14-8ee6-43a9-bc8b-45c26933b29e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.299686 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tftgc\" (UniqueName: \"kubernetes.io/projected/32602e14-8ee6-43a9-bc8b-45c26933b29e-kube-api-access-tftgc\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.299725 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32602e14-8ee6-43a9-bc8b-45c26933b29e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.613249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tj96" event={"ID":"32602e14-8ee6-43a9-bc8b-45c26933b29e","Type":"ContainerDied","Data":"293914060a541f57484c36c050d859e5003d2b10be612ba6476e85ce7325925d"} Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.613310 4792 scope.go:117] "RemoveContainer" containerID="855f4634b1de0ebf8765f583690f879c7e6d41070070bff03019b2b4ae664a45" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.613317 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tj96" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.650044 4792 scope.go:117] "RemoveContainer" containerID="d635af5c10ffd5752aae40afc4466b4e78badd0aac687d22970a37a93b4aeeb9" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.652498 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.666558 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tj96"] Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.677309 4792 scope.go:117] "RemoveContainer" containerID="5c8e5f98c1ec1eb3ff88b84c2f2f0d16ba62b9f7834e189132effe0ef5626d2b" Mar 18 16:10:39 crc kubenswrapper[4792]: I0318 16:10:39.873788 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" path="/var/lib/kubelet/pods/32602e14-8ee6-43a9-bc8b-45c26933b29e/volumes" Mar 18 16:10:47 crc kubenswrapper[4792]: I0318 16:10:47.855205 4792 scope.go:117] "RemoveContainer" containerID="9f6fece026c8d22796dd3944c910c620335f81930fd9b88315b465103fe800cd" Mar 18 16:10:47 crc kubenswrapper[4792]: I0318 16:10:47.919457 4792 scope.go:117] "RemoveContainer" containerID="0077bb29acb92306864d5545a151040ccb7524ded67ca47dc7a3f12787988ba5" Mar 18 16:10:47 crc kubenswrapper[4792]: I0318 16:10:47.965071 4792 scope.go:117] "RemoveContainer" containerID="9110fc60e1686b32186ba347c797f65e8a8ec417eeb79ae7dabc6926895c80bb" Mar 18 16:10:48 crc kubenswrapper[4792]: I0318 16:10:48.007337 4792 scope.go:117] "RemoveContainer" containerID="4e3f4c1a35e02ffa15d15a23f622cc4a603d0b0e539fb277895345da57f682c2" Mar 18 16:10:48 crc kubenswrapper[4792]: I0318 16:10:48.045388 4792 scope.go:117] "RemoveContainer" containerID="a9bb891cd1260870a7ea5683b1506cd23553ef73ce09e18168150c2bb71082bc" Mar 18 16:10:49 crc kubenswrapper[4792]: I0318 16:10:49.071298 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-d42vd"] Mar 18 16:10:49 crc kubenswrapper[4792]: I0318 16:10:49.084008 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-d42vd"] Mar 18 16:10:49 crc kubenswrapper[4792]: I0318 16:10:49.875438 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b243f42-c71b-4c56-817e-2345f5502ba6" path="/var/lib/kubelet/pods/5b243f42-c71b-4c56-817e-2345f5502ba6/volumes" Mar 18 16:10:56 crc kubenswrapper[4792]: I0318 16:10:56.032352 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qd24g"] Mar 18 16:10:56 crc kubenswrapper[4792]: I0318 16:10:56.044190 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qd24g"] Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.030023 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-4wg26"] Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.038908 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-cfe5-account-create-update-6tpq5"] Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.063288 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-4wg26"] Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.074435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-cfe5-account-create-update-6tpq5"] Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.867587 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a94a71-c815-4ad4-851b-fd5139b6561b" path="/var/lib/kubelet/pods/03a94a71-c815-4ad4-851b-fd5139b6561b/volumes" Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.895278 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21421779-aad0-4455-bd2f-90c3344e6fe6" 
path="/var/lib/kubelet/pods/21421779-aad0-4455-bd2f-90c3344e6fe6/volumes" Mar 18 16:10:57 crc kubenswrapper[4792]: I0318 16:10:57.902196 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6d5ef0-0a5f-4a17-976d-f971ced7792e" path="/var/lib/kubelet/pods/ff6d5ef0-0a5f-4a17-976d-f971ced7792e/volumes" Mar 18 16:11:00 crc kubenswrapper[4792]: I0318 16:11:00.321788 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:11:00 crc kubenswrapper[4792]: I0318 16:11:00.322336 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:11:04 crc kubenswrapper[4792]: I0318 16:11:04.893253 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" containerID="ccd32f17a1c6b12fe73495337b4c0df4b7d185ed9227935c6c59e39bb7eccc94" exitCode=0 Mar 18 16:11:04 crc kubenswrapper[4792]: I0318 16:11:04.893709 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" event={"ID":"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7","Type":"ContainerDied","Data":"ccd32f17a1c6b12fe73495337b4c0df4b7d185ed9227935c6c59e39bb7eccc94"} Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.458241 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.610635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam\") pod \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.610800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j\") pod \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.610905 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory\") pod \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\" (UID: \"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7\") " Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.615920 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j" (OuterVolumeSpecName: "kube-api-access-gvs5j") pod "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" (UID: "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7"). InnerVolumeSpecName "kube-api-access-gvs5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.643261 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory" (OuterVolumeSpecName: "inventory") pod "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" (UID: "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.644163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" (UID: "ea475c80-81c5-4bb6-937f-4a2a87d6d9e7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.714529 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.714557 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-kube-api-access-gvs5j\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.714583 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea475c80-81c5-4bb6-937f-4a2a87d6d9e7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.922346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" event={"ID":"ea475c80-81c5-4bb6-937f-4a2a87d6d9e7","Type":"ContainerDied","Data":"d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e"} Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 16:11:06.922685 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d98c8cb1117e1276d1d39f9d9c84d7fbda6e9198e3c0bfee7c9c0ff934a4f00e" Mar 18 16:11:06 crc kubenswrapper[4792]: I0318 
16:11:06.922615 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4dqtw" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.003428 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg"] Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.003993 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004020 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004045 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="extract-utilities" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004052 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="extract-utilities" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004067 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004073 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004090 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004098 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" 
containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004124 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="extract-utilities" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004132 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="extract-utilities" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004159 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="extract-content" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004166 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="extract-content" Mar 18 16:11:07 crc kubenswrapper[4792]: E0318 16:11:07.004184 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="extract-content" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="extract-content" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004437 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea475c80-81c5-4bb6-937f-4a2a87d6d9e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004457 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="283429bb-089d-4683-a2a9-581e26af8a6a" containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.004485 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="32602e14-8ee6-43a9-bc8b-45c26933b29e" containerName="registry-server" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.005507 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.008769 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.009033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.009284 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.024054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg"] Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.025824 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.122708 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvtc\" (UniqueName: \"kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.122828 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.122964 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.224840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.224947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvtc\" (UniqueName: \"kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.225040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.231020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.240137 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.243561 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvtc\" (UniqueName: \"kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.328089 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.873943 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg"] Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.880682 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:11:07 crc kubenswrapper[4792]: I0318 16:11:07.934137 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" event={"ID":"7cb772fb-4950-4c27-b7c5-28ca75682b99","Type":"ContainerStarted","Data":"51bc7701f2c260d6d0d491c679b6919cb1043eb04654c413b800c75afd7056ad"} Mar 18 16:11:08 crc kubenswrapper[4792]: I0318 16:11:08.949095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" event={"ID":"7cb772fb-4950-4c27-b7c5-28ca75682b99","Type":"ContainerStarted","Data":"6665e09e20cfc231eb3062bfb680689c6766c80b8515955b9d8af72f0c29b1df"} Mar 18 16:11:08 crc kubenswrapper[4792]: I0318 16:11:08.975311 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" podStartSLOduration=2.443939517 podStartE2EDuration="2.975294606s" podCreationTimestamp="2026-03-18 16:11:06 +0000 UTC" firstStartedPulling="2026-03-18 16:11:07.880326827 +0000 UTC m=+2216.749655764" lastFinishedPulling="2026-03-18 16:11:08.411681916 +0000 UTC m=+2217.281010853" observedRunningTime="2026-03-18 16:11:08.972056601 +0000 UTC m=+2217.841385548" watchObservedRunningTime="2026-03-18 16:11:08.975294606 +0000 UTC m=+2217.844623543" Mar 18 16:11:30 crc kubenswrapper[4792]: I0318 16:11:30.322338 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:11:30 crc kubenswrapper[4792]: I0318 16:11:30.322905 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:11:30 crc kubenswrapper[4792]: I0318 16:11:30.322956 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:11:30 crc kubenswrapper[4792]: I0318 16:11:30.324149 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:11:30 crc kubenswrapper[4792]: I0318 16:11:30.324230 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" gracePeriod=600 Mar 18 16:11:30 crc kubenswrapper[4792]: E0318 16:11:30.451165 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:11:31 crc kubenswrapper[4792]: I0318 16:11:31.175014 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" exitCode=0 Mar 18 16:11:31 crc kubenswrapper[4792]: I0318 16:11:31.175207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"} Mar 18 16:11:31 crc kubenswrapper[4792]: I0318 16:11:31.175340 4792 scope.go:117] "RemoveContainer" containerID="307ae670042f37d1c86800fa368bad970319cab3747c90e3d5fdc192ca393de6" Mar 18 16:11:31 crc kubenswrapper[4792]: I0318 16:11:31.176234 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:11:31 crc kubenswrapper[4792]: E0318 16:11:31.176575 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:11:34 crc kubenswrapper[4792]: I0318 16:11:34.039425 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlvbw"] Mar 18 16:11:34 crc kubenswrapper[4792]: I0318 16:11:34.051251 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlvbw"] Mar 18 16:11:35 crc kubenswrapper[4792]: I0318 16:11:35.885784 4792 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="6a77d6c0-2e1b-4b20-a323-8991335efabc" path="/var/lib/kubelet/pods/6a77d6c0-2e1b-4b20-a323-8991335efabc/volumes" Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.844958 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.848519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.854201 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.920144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.920270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:36 crc kubenswrapper[4792]: I0318 16:11:36.920490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.022379 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.022489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.022611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.022888 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.023328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.042934 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m\") pod \"redhat-marketplace-sbjlm\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.179119 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:37 crc kubenswrapper[4792]: I0318 16:11:37.654669 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:38 crc kubenswrapper[4792]: I0318 16:11:38.259436 4792 generic.go:334] "Generic (PLEG): container finished" podID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerID="2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79" exitCode=0 Mar 18 16:11:38 crc kubenswrapper[4792]: I0318 16:11:38.259476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerDied","Data":"2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79"} Mar 18 16:11:38 crc kubenswrapper[4792]: I0318 16:11:38.259520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerStarted","Data":"7bb6b7de34dc8e0034ec5b37dad41f30c6b29a3897f6735a420b34a6861fea6f"} Mar 18 16:11:39 crc kubenswrapper[4792]: I0318 16:11:39.271316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerStarted","Data":"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0"} Mar 18 16:11:40 crc kubenswrapper[4792]: I0318 16:11:40.283029 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerID="0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0" exitCode=0 Mar 18 16:11:40 crc kubenswrapper[4792]: I0318 16:11:40.283075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerDied","Data":"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0"} Mar 18 16:11:41 crc kubenswrapper[4792]: I0318 16:11:41.297631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerStarted","Data":"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d"} Mar 18 16:11:41 crc kubenswrapper[4792]: I0318 16:11:41.320859 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbjlm" podStartSLOduration=2.8311801450000003 podStartE2EDuration="5.320832965s" podCreationTimestamp="2026-03-18 16:11:36 +0000 UTC" firstStartedPulling="2026-03-18 16:11:38.261157893 +0000 UTC m=+2247.130486830" lastFinishedPulling="2026-03-18 16:11:40.750810713 +0000 UTC m=+2249.620139650" observedRunningTime="2026-03-18 16:11:41.315349463 +0000 UTC m=+2250.184678410" watchObservedRunningTime="2026-03-18 16:11:41.320832965 +0000 UTC m=+2250.190161902" Mar 18 16:11:43 crc kubenswrapper[4792]: I0318 16:11:43.855298 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:11:43 crc kubenswrapper[4792]: E0318 16:11:43.855953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:11:47 crc kubenswrapper[4792]: I0318 16:11:47.179831 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:47 crc kubenswrapper[4792]: I0318 16:11:47.180534 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:47 crc kubenswrapper[4792]: I0318 16:11:47.231079 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:47 crc kubenswrapper[4792]: I0318 16:11:47.423995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:47 crc kubenswrapper[4792]: I0318 16:11:47.488116 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:48 crc kubenswrapper[4792]: I0318 16:11:48.234455 4792 scope.go:117] "RemoveContainer" containerID="f3ad500ff66c09c5b3a9e0e708685b7002f322d759f4c0a0a99c29e4142d13a0" Mar 18 16:11:48 crc kubenswrapper[4792]: I0318 16:11:48.263599 4792 scope.go:117] "RemoveContainer" containerID="cec2c139bce9cdf2b6818641e8e357f68496af8b0060764db9cd7888366647a4" Mar 18 16:11:48 crc kubenswrapper[4792]: I0318 16:11:48.328281 4792 scope.go:117] "RemoveContainer" containerID="2d24ea1bd8d3af1178b2b52d434a09543425388c352c0dfe2afa8d021176fefb" Mar 18 16:11:48 crc kubenswrapper[4792]: I0318 16:11:48.375542 4792 scope.go:117] "RemoveContainer" containerID="d261b7dd451d531e087d82f9347526c698d26ede95d18889e101ab7c17f6cd6c" Mar 18 16:11:48 crc kubenswrapper[4792]: I0318 16:11:48.424716 4792 scope.go:117] "RemoveContainer" containerID="1fb4e977184085feb898aaa04e01d3b12d3b2c0f491e933a04720786f6ea2b7b" Mar 18 16:11:49 crc kubenswrapper[4792]: I0318 
16:11:49.417594 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbjlm" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="registry-server" containerID="cri-o://ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d" gracePeriod=2 Mar 18 16:11:49 crc kubenswrapper[4792]: I0318 16:11:49.983893 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.157273 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities\") pod \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.157367 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content\") pod \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.157531 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m\") pod \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\" (UID: \"8a55f1c6-b7f6-46c6-9848-0c29ef363716\") " Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.158327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities" (OuterVolumeSpecName: "utilities") pod "8a55f1c6-b7f6-46c6-9848-0c29ef363716" (UID: "8a55f1c6-b7f6-46c6-9848-0c29ef363716"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.158496 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.168864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m" (OuterVolumeSpecName: "kube-api-access-5h75m") pod "8a55f1c6-b7f6-46c6-9848-0c29ef363716" (UID: "8a55f1c6-b7f6-46c6-9848-0c29ef363716"). InnerVolumeSpecName "kube-api-access-5h75m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.261069 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/8a55f1c6-b7f6-46c6-9848-0c29ef363716-kube-api-access-5h75m\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.433278 4792 generic.go:334] "Generic (PLEG): container finished" podID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerID="ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d" exitCode=0 Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.433337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerDied","Data":"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d"} Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.433374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjlm" event={"ID":"8a55f1c6-b7f6-46c6-9848-0c29ef363716","Type":"ContainerDied","Data":"7bb6b7de34dc8e0034ec5b37dad41f30c6b29a3897f6735a420b34a6861fea6f"} Mar 18 16:11:50 crc kubenswrapper[4792]: 
I0318 16:11:50.433401 4792 scope.go:117] "RemoveContainer" containerID="ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.433632 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjlm" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.460174 4792 scope.go:117] "RemoveContainer" containerID="0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.486844 4792 scope.go:117] "RemoveContainer" containerID="2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.570060 4792 scope.go:117] "RemoveContainer" containerID="ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d" Mar 18 16:11:50 crc kubenswrapper[4792]: E0318 16:11:50.570890 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d\": container with ID starting with ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d not found: ID does not exist" containerID="ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.570944 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d"} err="failed to get container status \"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d\": rpc error: code = NotFound desc = could not find container \"ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d\": container with ID starting with ce3d25ecf806a70778f593568b8bd4a7224d5f36a523be037b5425aab9b5de4d not found: ID does not exist" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.570993 4792 
scope.go:117] "RemoveContainer" containerID="0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0" Mar 18 16:11:50 crc kubenswrapper[4792]: E0318 16:11:50.571360 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0\": container with ID starting with 0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0 not found: ID does not exist" containerID="0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.571392 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0"} err="failed to get container status \"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0\": rpc error: code = NotFound desc = could not find container \"0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0\": container with ID starting with 0c10575c33cd91aae76821716d78be8315b10591284f2321f9a1b147289877c0 not found: ID does not exist" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.571416 4792 scope.go:117] "RemoveContainer" containerID="2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79" Mar 18 16:11:50 crc kubenswrapper[4792]: E0318 16:11:50.571883 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79\": container with ID starting with 2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79 not found: ID does not exist" containerID="2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.571910 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79"} err="failed to get container status \"2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79\": rpc error: code = NotFound desc = could not find container \"2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79\": container with ID starting with 2a77ea2f8fd3f485644155255f763225c2fb8c6e365c1534242fa7b8a6620c79 not found: ID does not exist" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.674098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a55f1c6-b7f6-46c6-9848-0c29ef363716" (UID: "8a55f1c6-b7f6-46c6-9848-0c29ef363716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.775597 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a55f1c6-b7f6-46c6-9848-0c29ef363716-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.776778 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:50 crc kubenswrapper[4792]: I0318 16:11:50.786434 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjlm"] Mar 18 16:11:51 crc kubenswrapper[4792]: I0318 16:11:51.869572 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" path="/var/lib/kubelet/pods/8a55f1c6-b7f6-46c6-9848-0c29ef363716/volumes" Mar 18 16:11:53 crc kubenswrapper[4792]: I0318 16:11:53.467928 4792 generic.go:334] "Generic (PLEG): container finished" podID="7cb772fb-4950-4c27-b7c5-28ca75682b99" 
containerID="6665e09e20cfc231eb3062bfb680689c6766c80b8515955b9d8af72f0c29b1df" exitCode=0 Mar 18 16:11:53 crc kubenswrapper[4792]: I0318 16:11:53.468053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" event={"ID":"7cb772fb-4950-4c27-b7c5-28ca75682b99","Type":"ContainerDied","Data":"6665e09e20cfc231eb3062bfb680689c6766c80b8515955b9d8af72f0c29b1df"} Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.002585 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.188271 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam\") pod \"7cb772fb-4950-4c27-b7c5-28ca75682b99\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.188498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory\") pod \"7cb772fb-4950-4c27-b7c5-28ca75682b99\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.188563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvtc\" (UniqueName: \"kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc\") pod \"7cb772fb-4950-4c27-b7c5-28ca75682b99\" (UID: \"7cb772fb-4950-4c27-b7c5-28ca75682b99\") " Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.195751 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc" (OuterVolumeSpecName: 
"kube-api-access-xsvtc") pod "7cb772fb-4950-4c27-b7c5-28ca75682b99" (UID: "7cb772fb-4950-4c27-b7c5-28ca75682b99"). InnerVolumeSpecName "kube-api-access-xsvtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.222320 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory" (OuterVolumeSpecName: "inventory") pod "7cb772fb-4950-4c27-b7c5-28ca75682b99" (UID: "7cb772fb-4950-4c27-b7c5-28ca75682b99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.222706 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cb772fb-4950-4c27-b7c5-28ca75682b99" (UID: "7cb772fb-4950-4c27-b7c5-28ca75682b99"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.291752 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.292056 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb772fb-4950-4c27-b7c5-28ca75682b99-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.292131 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvtc\" (UniqueName: \"kubernetes.io/projected/7cb772fb-4950-4c27-b7c5-28ca75682b99-kube-api-access-xsvtc\") on node \"crc\" DevicePath \"\"" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.499137 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" event={"ID":"7cb772fb-4950-4c27-b7c5-28ca75682b99","Type":"ContainerDied","Data":"51bc7701f2c260d6d0d491c679b6919cb1043eb04654c413b800c75afd7056ad"} Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.499762 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51bc7701f2c260d6d0d491c679b6919cb1043eb04654c413b800c75afd7056ad" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.499175 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.584392 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sqqx6"] Mar 18 16:11:55 crc kubenswrapper[4792]: E0318 16:11:55.584916 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="extract-utilities" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.584934 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="extract-utilities" Mar 18 16:11:55 crc kubenswrapper[4792]: E0318 16:11:55.584944 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb772fb-4950-4c27-b7c5-28ca75682b99" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.584954 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb772fb-4950-4c27-b7c5-28ca75682b99" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:55 crc kubenswrapper[4792]: E0318 16:11:55.584994 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="extract-content" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.585003 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="extract-content" Mar 18 16:11:55 crc kubenswrapper[4792]: E0318 16:11:55.585019 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="registry-server" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.585027 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="registry-server" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.585333 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb772fb-4950-4c27-b7c5-28ca75682b99" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.585372 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a55f1c6-b7f6-46c6-9848-0c29ef363716" containerName="registry-server" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.586262 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.589727 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.589935 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.590026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.590155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.597319 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sqqx6"] Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.700656 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdzp\" (UniqueName: \"kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.700864 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.701001 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.803232 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.803336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.803457 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdzp\" (UniqueName: \"kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.807403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.817405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.833380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdzp\" (UniqueName: \"kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp\") pod \"ssh-known-hosts-edpm-deployment-sqqx6\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:55 crc kubenswrapper[4792]: I0318 16:11:55.915991 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:11:56 crc kubenswrapper[4792]: I0318 16:11:56.451342 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sqqx6"] Mar 18 16:11:56 crc kubenswrapper[4792]: I0318 16:11:56.509523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" event={"ID":"41821b7e-a517-44da-8768-18f9246d5bc2","Type":"ContainerStarted","Data":"8e78b441eed219edd6ebf3e0b9444cf91992eea2bc5a7009ddbfbb11b52a2da6"} Mar 18 16:11:57 crc kubenswrapper[4792]: I0318 16:11:57.522239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" event={"ID":"41821b7e-a517-44da-8768-18f9246d5bc2","Type":"ContainerStarted","Data":"62f8688a117c1e8122db2dd1734674d5ab464fcf7f5824ffc5d2be00e5a4f911"} Mar 18 16:11:57 crc kubenswrapper[4792]: I0318 16:11:57.539340 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" podStartSLOduration=2.095975151 podStartE2EDuration="2.539318117s" podCreationTimestamp="2026-03-18 16:11:55 +0000 UTC" firstStartedPulling="2026-03-18 16:11:56.463819448 +0000 UTC m=+2265.333148385" lastFinishedPulling="2026-03-18 16:11:56.907162414 +0000 UTC m=+2265.776491351" observedRunningTime="2026-03-18 16:11:57.536405266 +0000 UTC m=+2266.405734203" watchObservedRunningTime="2026-03-18 16:11:57.539318117 +0000 UTC m=+2266.408647054" Mar 18 16:11:58 crc kubenswrapper[4792]: I0318 16:11:58.855306 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:11:58 crc kubenswrapper[4792]: E0318 16:11:58.856539 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.142285 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564172-97lzn"] Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.145786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.151520 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.152067 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.152768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.158567 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-97lzn"] Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.227540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dbt\" (UniqueName: \"kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt\") pod \"auto-csr-approver-29564172-97lzn\" (UID: \"577931be-2eb9-4fc1-bbc3-9e552ede9dc7\") " pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.329665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dbt\" (UniqueName: \"kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt\") 
pod \"auto-csr-approver-29564172-97lzn\" (UID: \"577931be-2eb9-4fc1-bbc3-9e552ede9dc7\") " pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.347808 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dbt\" (UniqueName: \"kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt\") pod \"auto-csr-approver-29564172-97lzn\" (UID: \"577931be-2eb9-4fc1-bbc3-9e552ede9dc7\") " pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.471836 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:00 crc kubenswrapper[4792]: I0318 16:12:00.970269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-97lzn"] Mar 18 16:12:00 crc kubenswrapper[4792]: W0318 16:12:00.973373 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod577931be_2eb9_4fc1_bbc3_9e552ede9dc7.slice/crio-811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247 WatchSource:0}: Error finding container 811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247: Status 404 returned error can't find the container with id 811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247 Mar 18 16:12:01 crc kubenswrapper[4792]: I0318 16:12:01.564727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-97lzn" event={"ID":"577931be-2eb9-4fc1-bbc3-9e552ede9dc7","Type":"ContainerStarted","Data":"811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247"} Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.605735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" 
event={"ID":"41821b7e-a517-44da-8768-18f9246d5bc2","Type":"ContainerDied","Data":"62f8688a117c1e8122db2dd1734674d5ab464fcf7f5824ffc5d2be00e5a4f911"} Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.605626 4792 generic.go:334] "Generic (PLEG): container finished" podID="41821b7e-a517-44da-8768-18f9246d5bc2" containerID="62f8688a117c1e8122db2dd1734674d5ab464fcf7f5824ffc5d2be00e5a4f911" exitCode=0 Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.608207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-97lzn" event={"ID":"577931be-2eb9-4fc1-bbc3-9e552ede9dc7","Type":"ContainerStarted","Data":"2cb7c3f4ee8e1682c4f0198e11c4087871da32cc1cd6dce9e2dfa86495df539f"} Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.669219 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.671839 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.684289 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.690236 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564172-97lzn" podStartSLOduration=2.28909967 podStartE2EDuration="3.690214282s" podCreationTimestamp="2026-03-18 16:12:00 +0000 UTC" firstStartedPulling="2026-03-18 16:12:00.976482698 +0000 UTC m=+2269.845811635" lastFinishedPulling="2026-03-18 16:12:02.37759731 +0000 UTC m=+2271.246926247" observedRunningTime="2026-03-18 16:12:03.652557359 +0000 UTC m=+2272.521886296" watchObservedRunningTime="2026-03-18 16:12:03.690214282 +0000 UTC m=+2272.559543229" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.740028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbgd\" (UniqueName: \"kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.740145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.740496 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities\") pod 
\"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.842254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.842385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.842480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbgd\" (UniqueName: \"kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.842731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.842807 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities\") pod \"community-operators-f7qqp\" (UID: 
\"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:03 crc kubenswrapper[4792]: I0318 16:12:03.870054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbgd\" (UniqueName: \"kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd\") pod \"community-operators-f7qqp\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:04 crc kubenswrapper[4792]: I0318 16:12:04.003671 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:04 crc kubenswrapper[4792]: I0318 16:12:04.586735 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:04 crc kubenswrapper[4792]: W0318 16:12:04.590542 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5fe552d_9a9c_4186_9f66_985dc5da2142.slice/crio-361d6b3a7286af2735ae3cf0b55f0a008420177148e600db2b73dd56a75240c8 WatchSource:0}: Error finding container 361d6b3a7286af2735ae3cf0b55f0a008420177148e600db2b73dd56a75240c8: Status 404 returned error can't find the container with id 361d6b3a7286af2735ae3cf0b55f0a008420177148e600db2b73dd56a75240c8 Mar 18 16:12:04 crc kubenswrapper[4792]: I0318 16:12:04.622837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerStarted","Data":"361d6b3a7286af2735ae3cf0b55f0a008420177148e600db2b73dd56a75240c8"} Mar 18 16:12:04 crc kubenswrapper[4792]: I0318 16:12:04.626775 4792 generic.go:334] "Generic (PLEG): container finished" podID="577931be-2eb9-4fc1-bbc3-9e552ede9dc7" containerID="2cb7c3f4ee8e1682c4f0198e11c4087871da32cc1cd6dce9e2dfa86495df539f" exitCode=0 Mar 18 16:12:04 
crc kubenswrapper[4792]: I0318 16:12:04.627596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-97lzn" event={"ID":"577931be-2eb9-4fc1-bbc3-9e552ede9dc7","Type":"ContainerDied","Data":"2cb7c3f4ee8e1682c4f0198e11c4087871da32cc1cd6dce9e2dfa86495df539f"} Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.111635 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.171516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam\") pod \"41821b7e-a517-44da-8768-18f9246d5bc2\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.171569 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcdzp\" (UniqueName: \"kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp\") pod \"41821b7e-a517-44da-8768-18f9246d5bc2\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.171662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0\") pod \"41821b7e-a517-44da-8768-18f9246d5bc2\" (UID: \"41821b7e-a517-44da-8768-18f9246d5bc2\") " Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.179045 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp" (OuterVolumeSpecName: "kube-api-access-jcdzp") pod "41821b7e-a517-44da-8768-18f9246d5bc2" (UID: "41821b7e-a517-44da-8768-18f9246d5bc2"). 
InnerVolumeSpecName "kube-api-access-jcdzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.207760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "41821b7e-a517-44da-8768-18f9246d5bc2" (UID: "41821b7e-a517-44da-8768-18f9246d5bc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.215612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "41821b7e-a517-44da-8768-18f9246d5bc2" (UID: "41821b7e-a517-44da-8768-18f9246d5bc2"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.273954 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.274155 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcdzp\" (UniqueName: \"kubernetes.io/projected/41821b7e-a517-44da-8768-18f9246d5bc2-kube-api-access-jcdzp\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.274213 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/41821b7e-a517-44da-8768-18f9246d5bc2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.642506 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerID="861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650" exitCode=0 Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.642567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerDied","Data":"861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650"} Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.660790 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.662226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sqqx6" event={"ID":"41821b7e-a517-44da-8768-18f9246d5bc2","Type":"ContainerDied","Data":"8e78b441eed219edd6ebf3e0b9444cf91992eea2bc5a7009ddbfbb11b52a2da6"} Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.662279 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e78b441eed219edd6ebf3e0b9444cf91992eea2bc5a7009ddbfbb11b52a2da6" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.752141 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5"] Mar 18 16:12:05 crc kubenswrapper[4792]: E0318 16:12:05.752716 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41821b7e-a517-44da-8768-18f9246d5bc2" containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.752734 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="41821b7e-a517-44da-8768-18f9246d5bc2" containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.753035 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="41821b7e-a517-44da-8768-18f9246d5bc2" 
containerName="ssh-known-hosts-edpm-deployment" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.753943 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.756459 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.756803 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.757111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.757361 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.763825 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5"] Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.786783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wgmj\" (UniqueName: \"kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.786863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.786986 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.889571 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wgmj\" (UniqueName: \"kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.889618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.889778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.906191 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.906584 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.909768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wgmj\" (UniqueName: \"kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xqcf5\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.977340 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.991794 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55dbt\" (UniqueName: \"kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt\") pod \"577931be-2eb9-4fc1-bbc3-9e552ede9dc7\" (UID: \"577931be-2eb9-4fc1-bbc3-9e552ede9dc7\") " Mar 18 16:12:05 crc kubenswrapper[4792]: I0318 16:12:05.997293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt" (OuterVolumeSpecName: "kube-api-access-55dbt") pod "577931be-2eb9-4fc1-bbc3-9e552ede9dc7" (UID: "577931be-2eb9-4fc1-bbc3-9e552ede9dc7"). InnerVolumeSpecName "kube-api-access-55dbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.074939 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.095023 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55dbt\" (UniqueName: \"kubernetes.io/projected/577931be-2eb9-4fc1-bbc3-9e552ede9dc7-kube-api-access-55dbt\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.641097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5"] Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.680325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564172-97lzn" event={"ID":"577931be-2eb9-4fc1-bbc3-9e552ede9dc7","Type":"ContainerDied","Data":"811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247"} Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.680388 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="811643804a4e762967a7056093d9770e06c23934c3e19215fa6206681dd99247" Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.680404 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564172-97lzn" Mar 18 16:12:06 crc kubenswrapper[4792]: I0318 16:12:06.684315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" event={"ID":"b7c1917f-d843-42d7-9473-ab2f98a7edf6","Type":"ContainerStarted","Data":"9b27761ee88b698fefeed9d4599bada23403e3a1ec7ae9b1733fc57204f1bd92"} Mar 18 16:12:07 crc kubenswrapper[4792]: I0318 16:12:07.060108 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-6shz9"] Mar 18 16:12:07 crc kubenswrapper[4792]: I0318 16:12:07.075734 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564166-6shz9"] Mar 18 16:12:07 crc kubenswrapper[4792]: I0318 16:12:07.695760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerStarted","Data":"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8"} Mar 18 16:12:07 crc kubenswrapper[4792]: I0318 16:12:07.869139 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a53da3c-f3b8-4a0b-8ce6-c71399912c79" path="/var/lib/kubelet/pods/9a53da3c-f3b8-4a0b-8ce6-c71399912c79/volumes" Mar 18 16:12:08 crc kubenswrapper[4792]: I0318 16:12:08.710706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" event={"ID":"b7c1917f-d843-42d7-9473-ab2f98a7edf6","Type":"ContainerStarted","Data":"a15178304e6e52ff2a3a6b8c0beb2858e542afb631958dffb06a38de27bb766d"} Mar 18 16:12:08 crc kubenswrapper[4792]: I0318 16:12:08.713828 4792 generic.go:334] "Generic (PLEG): container finished" podID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerID="d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8" exitCode=0 Mar 18 16:12:08 crc kubenswrapper[4792]: I0318 16:12:08.713907 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerDied","Data":"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8"} Mar 18 16:12:08 crc kubenswrapper[4792]: I0318 16:12:08.801298 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" podStartSLOduration=2.2578460590000002 podStartE2EDuration="3.801267196s" podCreationTimestamp="2026-03-18 16:12:05 +0000 UTC" firstStartedPulling="2026-03-18 16:12:06.661351365 +0000 UTC m=+2275.530680302" lastFinishedPulling="2026-03-18 16:12:08.204772502 +0000 UTC m=+2277.074101439" observedRunningTime="2026-03-18 16:12:08.744908827 +0000 UTC m=+2277.614237764" watchObservedRunningTime="2026-03-18 16:12:08.801267196 +0000 UTC m=+2277.670596173" Mar 18 16:12:10 crc kubenswrapper[4792]: I0318 16:12:10.855039 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:12:10 crc kubenswrapper[4792]: E0318 16:12:10.855911 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:12:13 crc kubenswrapper[4792]: I0318 16:12:13.769447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerStarted","Data":"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c"} Mar 18 16:12:13 crc kubenswrapper[4792]: I0318 16:12:13.795448 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7qqp" podStartSLOduration=3.2581801280000002 podStartE2EDuration="10.7954239s" podCreationTimestamp="2026-03-18 16:12:03 +0000 UTC" firstStartedPulling="2026-03-18 16:12:05.646448337 +0000 UTC m=+2274.515777274" lastFinishedPulling="2026-03-18 16:12:13.183692069 +0000 UTC m=+2282.053021046" observedRunningTime="2026-03-18 16:12:13.787416609 +0000 UTC m=+2282.656745556" watchObservedRunningTime="2026-03-18 16:12:13.7954239 +0000 UTC m=+2282.664752837" Mar 18 16:12:14 crc kubenswrapper[4792]: I0318 16:12:14.004817 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:14 crc kubenswrapper[4792]: I0318 16:12:14.004875 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:15 crc kubenswrapper[4792]: I0318 16:12:15.073892 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-f7qqp" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="registry-server" probeResult="failure" output=< Mar 18 16:12:15 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:12:15 crc kubenswrapper[4792]: > Mar 18 16:12:15 crc kubenswrapper[4792]: I0318 16:12:15.792502 4792 generic.go:334] "Generic (PLEG): container finished" podID="b7c1917f-d843-42d7-9473-ab2f98a7edf6" containerID="a15178304e6e52ff2a3a6b8c0beb2858e542afb631958dffb06a38de27bb766d" exitCode=0 Mar 18 16:12:15 crc kubenswrapper[4792]: I0318 16:12:15.792567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" event={"ID":"b7c1917f-d843-42d7-9473-ab2f98a7edf6","Type":"ContainerDied","Data":"a15178304e6e52ff2a3a6b8c0beb2858e542afb631958dffb06a38de27bb766d"} Mar 18 16:12:17 crc kubenswrapper[4792]: 
I0318 16:12:17.269715 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.369829 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wgmj\" (UniqueName: \"kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj\") pod \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.371002 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam\") pod \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.371118 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory\") pod \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\" (UID: \"b7c1917f-d843-42d7-9473-ab2f98a7edf6\") " Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.376247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj" (OuterVolumeSpecName: "kube-api-access-5wgmj") pod "b7c1917f-d843-42d7-9473-ab2f98a7edf6" (UID: "b7c1917f-d843-42d7-9473-ab2f98a7edf6"). InnerVolumeSpecName "kube-api-access-5wgmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.412494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b7c1917f-d843-42d7-9473-ab2f98a7edf6" (UID: "b7c1917f-d843-42d7-9473-ab2f98a7edf6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.422448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory" (OuterVolumeSpecName: "inventory") pod "b7c1917f-d843-42d7-9473-ab2f98a7edf6" (UID: "b7c1917f-d843-42d7-9473-ab2f98a7edf6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.474466 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wgmj\" (UniqueName: \"kubernetes.io/projected/b7c1917f-d843-42d7-9473-ab2f98a7edf6-kube-api-access-5wgmj\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.474792 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.474812 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7c1917f-d843-42d7-9473-ab2f98a7edf6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.813307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" 
event={"ID":"b7c1917f-d843-42d7-9473-ab2f98a7edf6","Type":"ContainerDied","Data":"9b27761ee88b698fefeed9d4599bada23403e3a1ec7ae9b1733fc57204f1bd92"} Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.813350 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b27761ee88b698fefeed9d4599bada23403e3a1ec7ae9b1733fc57204f1bd92" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.813664 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xqcf5" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.894395 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq"] Mar 18 16:12:17 crc kubenswrapper[4792]: E0318 16:12:17.895108 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c1917f-d843-42d7-9473-ab2f98a7edf6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.895130 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c1917f-d843-42d7-9473-ab2f98a7edf6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:17 crc kubenswrapper[4792]: E0318 16:12:17.895147 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577931be-2eb9-4fc1-bbc3-9e552ede9dc7" containerName="oc" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.895155 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="577931be-2eb9-4fc1-bbc3-9e552ede9dc7" containerName="oc" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.895536 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="577931be-2eb9-4fc1-bbc3-9e552ede9dc7" containerName="oc" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.895562 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c1917f-d843-42d7-9473-ab2f98a7edf6" 
containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.896630 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.905640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.906245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq"] Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.907023 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.907115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:12:17 crc kubenswrapper[4792]: I0318 16:12:17.907387 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.092028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.092381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l69xm\" (UniqueName: \"kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: 
\"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.092515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.194583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.194639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l69xm\" (UniqueName: \"kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.194709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.198678 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.198941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.218189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l69xm\" (UniqueName: \"kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.234189 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.788861 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq"] Mar 18 16:12:18 crc kubenswrapper[4792]: I0318 16:12:18.827273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" event={"ID":"df844859-17d2-4d51-9eb5-f4c9148267cb","Type":"ContainerStarted","Data":"c5dfcf9ec2bd8cc0ad0d2e53d0157dfde033f39fe47536e710173139b7ed3e8f"} Mar 18 16:12:19 crc kubenswrapper[4792]: I0318 16:12:19.837714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" event={"ID":"df844859-17d2-4d51-9eb5-f4c9148267cb","Type":"ContainerStarted","Data":"9097499fb29b3f90aabcc5747c770025af5c8e6f5341e9ee1d91b77fd0b43cb5"} Mar 18 16:12:19 crc kubenswrapper[4792]: I0318 16:12:19.856839 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" podStartSLOduration=2.420497509 podStartE2EDuration="2.856824255s" podCreationTimestamp="2026-03-18 16:12:17 +0000 UTC" firstStartedPulling="2026-03-18 16:12:18.80277286 +0000 UTC m=+2287.672101797" lastFinishedPulling="2026-03-18 16:12:19.239099596 +0000 UTC m=+2288.108428543" observedRunningTime="2026-03-18 16:12:19.856287408 +0000 UTC m=+2288.725616345" watchObservedRunningTime="2026-03-18 16:12:19.856824255 +0000 UTC m=+2288.726153192" Mar 18 16:12:22 crc kubenswrapper[4792]: I0318 16:12:22.855411 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:12:22 crc kubenswrapper[4792]: E0318 16:12:22.856981 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:12:24 crc kubenswrapper[4792]: I0318 16:12:24.069852 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:24 crc kubenswrapper[4792]: I0318 16:12:24.132117 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:24 crc kubenswrapper[4792]: I0318 16:12:24.311864 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:25 crc kubenswrapper[4792]: I0318 16:12:25.902132 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7qqp" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="registry-server" containerID="cri-o://5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c" gracePeriod=2 Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.425678 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.528222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities\") pod \"d5fe552d-9a9c-4186-9f66-985dc5da2142\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.528337 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content\") pod \"d5fe552d-9a9c-4186-9f66-985dc5da2142\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.528381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbgd\" (UniqueName: \"kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd\") pod \"d5fe552d-9a9c-4186-9f66-985dc5da2142\" (UID: \"d5fe552d-9a9c-4186-9f66-985dc5da2142\") " Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.539404 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities" (OuterVolumeSpecName: "utilities") pod "d5fe552d-9a9c-4186-9f66-985dc5da2142" (UID: "d5fe552d-9a9c-4186-9f66-985dc5da2142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.539628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd" (OuterVolumeSpecName: "kube-api-access-ssbgd") pod "d5fe552d-9a9c-4186-9f66-985dc5da2142" (UID: "d5fe552d-9a9c-4186-9f66-985dc5da2142"). InnerVolumeSpecName "kube-api-access-ssbgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.581621 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5fe552d-9a9c-4186-9f66-985dc5da2142" (UID: "d5fe552d-9a9c-4186-9f66-985dc5da2142"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.631362 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.631407 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5fe552d-9a9c-4186-9f66-985dc5da2142-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.631425 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbgd\" (UniqueName: \"kubernetes.io/projected/d5fe552d-9a9c-4186-9f66-985dc5da2142-kube-api-access-ssbgd\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.914145 4792 generic.go:334] "Generic (PLEG): container finished" podID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerID="5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c" exitCode=0 Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.914241 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7qqp" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.914243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerDied","Data":"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c"} Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.914557 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7qqp" event={"ID":"d5fe552d-9a9c-4186-9f66-985dc5da2142","Type":"ContainerDied","Data":"361d6b3a7286af2735ae3cf0b55f0a008420177148e600db2b73dd56a75240c8"} Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.914580 4792 scope.go:117] "RemoveContainer" containerID="5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.956400 4792 scope.go:117] "RemoveContainer" containerID="d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8" Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.964936 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.975552 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7qqp"] Mar 18 16:12:26 crc kubenswrapper[4792]: I0318 16:12:26.977546 4792 scope.go:117] "RemoveContainer" containerID="861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.037709 4792 scope.go:117] "RemoveContainer" containerID="5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c" Mar 18 16:12:27 crc kubenswrapper[4792]: E0318 16:12:27.038277 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c\": container with ID starting with 5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c not found: ID does not exist" containerID="5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.038334 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c"} err="failed to get container status \"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c\": rpc error: code = NotFound desc = could not find container \"5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c\": container with ID starting with 5ab6392fde454a8da872ac7f8c5f8f67e1fc6f98f8d6b18903f118e7d5ad643c not found: ID does not exist" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.038370 4792 scope.go:117] "RemoveContainer" containerID="d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8" Mar 18 16:12:27 crc kubenswrapper[4792]: E0318 16:12:27.038711 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8\": container with ID starting with d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8 not found: ID does not exist" containerID="d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.038738 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8"} err="failed to get container status \"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8\": rpc error: code = NotFound desc = could not find container \"d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8\": container with ID 
starting with d9e0fbf7a6a39b76d511fc812b8eb885c415dca15d3ac6bb2d581e1ea3afdcb8 not found: ID does not exist" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.038757 4792 scope.go:117] "RemoveContainer" containerID="861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650" Mar 18 16:12:27 crc kubenswrapper[4792]: E0318 16:12:27.039113 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650\": container with ID starting with 861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650 not found: ID does not exist" containerID="861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.039155 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650"} err="failed to get container status \"861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650\": rpc error: code = NotFound desc = could not find container \"861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650\": container with ID starting with 861e98d45bf4bccd8af567c7775e2793258313ac843873e346345ae731339650 not found: ID does not exist" Mar 18 16:12:27 crc kubenswrapper[4792]: I0318 16:12:27.866292 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" path="/var/lib/kubelet/pods/d5fe552d-9a9c-4186-9f66-985dc5da2142/volumes" Mar 18 16:12:29 crc kubenswrapper[4792]: I0318 16:12:29.948519 4792 generic.go:334] "Generic (PLEG): container finished" podID="df844859-17d2-4d51-9eb5-f4c9148267cb" containerID="9097499fb29b3f90aabcc5747c770025af5c8e6f5341e9ee1d91b77fd0b43cb5" exitCode=0 Mar 18 16:12:29 crc kubenswrapper[4792]: I0318 16:12:29.948597 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" event={"ID":"df844859-17d2-4d51-9eb5-f4c9148267cb","Type":"ContainerDied","Data":"9097499fb29b3f90aabcc5747c770025af5c8e6f5341e9ee1d91b77fd0b43cb5"} Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.451512 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.572157 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory\") pod \"df844859-17d2-4d51-9eb5-f4c9148267cb\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.572330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam\") pod \"df844859-17d2-4d51-9eb5-f4c9148267cb\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.572458 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l69xm\" (UniqueName: \"kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm\") pod \"df844859-17d2-4d51-9eb5-f4c9148267cb\" (UID: \"df844859-17d2-4d51-9eb5-f4c9148267cb\") " Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.578001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm" (OuterVolumeSpecName: "kube-api-access-l69xm") pod "df844859-17d2-4d51-9eb5-f4c9148267cb" (UID: "df844859-17d2-4d51-9eb5-f4c9148267cb"). InnerVolumeSpecName "kube-api-access-l69xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.605104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df844859-17d2-4d51-9eb5-f4c9148267cb" (UID: "df844859-17d2-4d51-9eb5-f4c9148267cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.620115 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory" (OuterVolumeSpecName: "inventory") pod "df844859-17d2-4d51-9eb5-f4c9148267cb" (UID: "df844859-17d2-4d51-9eb5-f4c9148267cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.676503 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.676763 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l69xm\" (UniqueName: \"kubernetes.io/projected/df844859-17d2-4d51-9eb5-f4c9148267cb-kube-api-access-l69xm\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.676835 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df844859-17d2-4d51-9eb5-f4c9148267cb-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.969089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" 
event={"ID":"df844859-17d2-4d51-9eb5-f4c9148267cb","Type":"ContainerDied","Data":"c5dfcf9ec2bd8cc0ad0d2e53d0157dfde033f39fe47536e710173139b7ed3e8f"} Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.969129 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5dfcf9ec2bd8cc0ad0d2e53d0157dfde033f39fe47536e710173139b7ed3e8f" Mar 18 16:12:31 crc kubenswrapper[4792]: I0318 16:12:31.969167 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.199322 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj"] Mar 18 16:12:32 crc kubenswrapper[4792]: E0318 16:12:32.200095 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="registry-server" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200118 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="registry-server" Mar 18 16:12:32 crc kubenswrapper[4792]: E0318 16:12:32.200133 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="extract-content" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200140 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="extract-content" Mar 18 16:12:32 crc kubenswrapper[4792]: E0318 16:12:32.200169 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df844859-17d2-4d51-9eb5-f4c9148267cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200177 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="df844859-17d2-4d51-9eb5-f4c9148267cb" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:32 crc kubenswrapper[4792]: E0318 16:12:32.200199 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="extract-utilities" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200206 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="extract-utilities" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200496 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="df844859-17d2-4d51-9eb5-f4c9148267cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.200552 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5fe552d-9a9c-4186-9f66-985dc5da2142" containerName="registry-server" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.201438 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.205053 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.205190 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.205190 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.205246 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.205252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.207273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.210924 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.215327 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.215847 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.225062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj"] Mar 18 16:12:32 crc 
kubenswrapper[4792]: I0318 16:12:32.292912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.292964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293781 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75vj\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") 
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.293879 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: 
\"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396346 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396490 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75vj\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396718 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396781 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.396809 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.405308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.405898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.408565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.408941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.409473 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.411699 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.411862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.411989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.414817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.416702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.417754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.417926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.418733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.420553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75vj\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.421289 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.422543 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-77zpj\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:32 crc kubenswrapper[4792]: I0318 16:12:32.519348 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:12:33 crc kubenswrapper[4792]: I0318 16:12:33.061392 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj"] Mar 18 16:12:33 crc kubenswrapper[4792]: W0318 16:12:33.065165 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4620d8fc_6dea_47d8_9d4a_fd9e8ceb2116.slice/crio-23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3 WatchSource:0}: Error finding container 23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3: Status 404 returned error can't find the container with id 23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3 Mar 18 16:12:33 crc kubenswrapper[4792]: I0318 16:12:33.992405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" event={"ID":"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116","Type":"ContainerStarted","Data":"60a8df6a82bf1e934c8e115262b8674e7df3181bb1e9c9a6cc76e0cde5ecb44f"} Mar 18 16:12:33 crc kubenswrapper[4792]: I0318 16:12:33.992682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" event={"ID":"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116","Type":"ContainerStarted","Data":"23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3"} Mar 18 16:12:34 crc kubenswrapper[4792]: I0318 16:12:34.032381 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" podStartSLOduration=1.541020799 podStartE2EDuration="2.032353411s" podCreationTimestamp="2026-03-18 16:12:32 +0000 UTC" firstStartedPulling="2026-03-18 16:12:33.072663256 +0000 UTC m=+2301.941992193" lastFinishedPulling="2026-03-18 16:12:33.563995868 +0000 UTC 
m=+2302.433324805" observedRunningTime="2026-03-18 16:12:34.021512381 +0000 UTC m=+2302.890841338" watchObservedRunningTime="2026-03-18 16:12:34.032353411 +0000 UTC m=+2302.901682348" Mar 18 16:12:37 crc kubenswrapper[4792]: I0318 16:12:37.855127 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:12:37 crc kubenswrapper[4792]: E0318 16:12:37.856035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:12:48 crc kubenswrapper[4792]: I0318 16:12:48.611031 4792 scope.go:117] "RemoveContainer" containerID="9901a8f054b8edc0cb76beda24d571e1f848738729ad7ee4328cf25618e6ecbe" Mar 18 16:12:50 crc kubenswrapper[4792]: I0318 16:12:50.854503 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:12:50 crc kubenswrapper[4792]: E0318 16:12:50.855290 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:02 crc kubenswrapper[4792]: I0318 16:13:02.041726 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-x8s6w"] Mar 18 16:13:02 crc kubenswrapper[4792]: I0318 16:13:02.060421 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-db-sync-x8s6w"] Mar 18 16:13:03 crc kubenswrapper[4792]: I0318 16:13:03.868354 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395511e9-6a0e-4101-8e72-87a46bf1218f" path="/var/lib/kubelet/pods/395511e9-6a0e-4101-8e72-87a46bf1218f/volumes" Mar 18 16:13:04 crc kubenswrapper[4792]: I0318 16:13:04.857923 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:13:04 crc kubenswrapper[4792]: E0318 16:13:04.858377 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:13 crc kubenswrapper[4792]: I0318 16:13:13.405177 4792 generic.go:334] "Generic (PLEG): container finished" podID="4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" containerID="60a8df6a82bf1e934c8e115262b8674e7df3181bb1e9c9a6cc76e0cde5ecb44f" exitCode=0 Mar 18 16:13:13 crc kubenswrapper[4792]: I0318 16:13:13.405340 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" event={"ID":"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116","Type":"ContainerDied","Data":"60a8df6a82bf1e934c8e115262b8674e7df3181bb1e9c9a6cc76e0cde5ecb44f"} Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.113808 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229305 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229424 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229540 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75vj\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229934 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.229981 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.230040 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.230099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.230139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle\") pod \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\" (UID: \"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116\") " Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.238773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.238886 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.238914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.238961 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.239284 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.239400 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj" (OuterVolumeSpecName: "kube-api-access-l75vj") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "kube-api-access-l75vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.239422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.239434 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.239769 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.242327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.242385 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.243697 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.243684 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.249484 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.272377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.273074 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory" (OuterVolumeSpecName: "inventory") pod "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" (UID: "4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333479 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333531 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333547 4792 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 
16:13:15.333562 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333579 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333592 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333604 4792 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333620 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333634 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333647 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333660 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333673 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75vj\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-kube-api-access-l75vj\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333684 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333700 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333711 4792 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.333723 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.431212 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" event={"ID":"4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116","Type":"ContainerDied","Data":"23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3"} Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.431262 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e71a1349bdfdb1895ff6d92a199da04ca6f2ce560f864bbf68468eb486aca3" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.431326 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-77zpj" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.525444 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74"] Mar 18 16:13:15 crc kubenswrapper[4792]: E0318 16:13:15.526029 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.526054 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.526354 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.527387 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.531629 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.532107 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.532128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.532341 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.532763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.547495 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74"] Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.640947 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.641013 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: 
\"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.641079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.641106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc5h8\" (UniqueName: \"kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.641446 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.743665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.743713 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pc5h8\" (UniqueName: \"kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.743872 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.743937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.743958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.745014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.748546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.748951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.749125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.765430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc5h8\" (UniqueName: \"kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kgh74\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.855467 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:13:15 crc kubenswrapper[4792]: E0318 
16:13:15.855800 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:15 crc kubenswrapper[4792]: I0318 16:13:15.891358 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" Mar 18 16:13:16 crc kubenswrapper[4792]: W0318 16:13:16.443620 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e02ebcc_002d_4a76_b1f6_12298f2beaa0.slice/crio-e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222 WatchSource:0}: Error finding container e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222: Status 404 returned error can't find the container with id e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222 Mar 18 16:13:16 crc kubenswrapper[4792]: I0318 16:13:16.451582 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74"] Mar 18 16:13:17 crc kubenswrapper[4792]: I0318 16:13:17.452877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" event={"ID":"8e02ebcc-002d-4a76-b1f6-12298f2beaa0","Type":"ContainerStarted","Data":"ff3b3104c5357a7556721b5ce1d2da69ea1b16cd3fe89b34a953f2504d4c80ff"} Mar 18 16:13:17 crc kubenswrapper[4792]: I0318 16:13:17.453466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" 
event={"ID":"8e02ebcc-002d-4a76-b1f6-12298f2beaa0","Type":"ContainerStarted","Data":"e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222"} Mar 18 16:13:17 crc kubenswrapper[4792]: I0318 16:13:17.473802 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" podStartSLOduration=1.967020145 podStartE2EDuration="2.47377674s" podCreationTimestamp="2026-03-18 16:13:15 +0000 UTC" firstStartedPulling="2026-03-18 16:13:16.446367033 +0000 UTC m=+2345.315695980" lastFinishedPulling="2026-03-18 16:13:16.953123638 +0000 UTC m=+2345.822452575" observedRunningTime="2026-03-18 16:13:17.468467334 +0000 UTC m=+2346.337796281" watchObservedRunningTime="2026-03-18 16:13:17.47377674 +0000 UTC m=+2346.343105677" Mar 18 16:13:26 crc kubenswrapper[4792]: I0318 16:13:26.855058 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:13:26 crc kubenswrapper[4792]: E0318 16:13:26.855913 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:38 crc kubenswrapper[4792]: I0318 16:13:38.415512 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podUID="423d82c6-fd0b-4cb5-8ff2-501f479a9a73" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:13:38 crc kubenswrapper[4792]: I0318 16:13:38.855304 4792 scope.go:117] "RemoveContainer" 
containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:13:38 crc kubenswrapper[4792]: E0318 16:13:38.855764 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:48 crc kubenswrapper[4792]: I0318 16:13:48.850720 4792 scope.go:117] "RemoveContainer" containerID="7ac2b924412360c31fdb7b224d2357e7bbc46b8a97725628fd5c83eb408a5065" Mar 18 16:13:50 crc kubenswrapper[4792]: I0318 16:13:50.042747 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-vjjkg"] Mar 18 16:13:50 crc kubenswrapper[4792]: I0318 16:13:50.053140 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-vjjkg"] Mar 18 16:13:50 crc kubenswrapper[4792]: I0318 16:13:50.855259 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:13:50 crc kubenswrapper[4792]: E0318 16:13:50.855609 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:13:51 crc kubenswrapper[4792]: I0318 16:13:51.875284 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840a93f2-c524-4d8e-a761-8a075a9266da" path="/var/lib/kubelet/pods/840a93f2-c524-4d8e-a761-8a075a9266da/volumes" Mar 18 16:14:00 
crc kubenswrapper[4792]: I0318 16:14:00.151396 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564174-smhwj"] Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.155787 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-smhwj" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.158674 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.158736 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.159000 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.170273 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-smhwj"] Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.244606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvx2\" (UniqueName: \"kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2\") pod \"auto-csr-approver-29564174-smhwj\" (UID: \"c829052d-b5b1-4943-b4e1-75a1f5ebc66b\") " pod="openshift-infra/auto-csr-approver-29564174-smhwj" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.347340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvx2\" (UniqueName: \"kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2\") pod \"auto-csr-approver-29564174-smhwj\" (UID: \"c829052d-b5b1-4943-b4e1-75a1f5ebc66b\") " pod="openshift-infra/auto-csr-approver-29564174-smhwj" Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.366159 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvx2\" (UniqueName: \"kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2\") pod \"auto-csr-approver-29564174-smhwj\" (UID: \"c829052d-b5b1-4943-b4e1-75a1f5ebc66b\") " pod="openshift-infra/auto-csr-approver-29564174-smhwj"
Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.499127 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-smhwj"
Mar 18 16:14:00 crc kubenswrapper[4792]: I0318 16:14:00.980900 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-smhwj"]
Mar 18 16:14:01 crc kubenswrapper[4792]: I0318 16:14:01.062016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-smhwj" event={"ID":"c829052d-b5b1-4943-b4e1-75a1f5ebc66b","Type":"ContainerStarted","Data":"ffa15111588dfd930741a4405b2bf45b5ac840c5c148b65c121a412cc2c41f39"}
Mar 18 16:14:01 crc kubenswrapper[4792]: I0318 16:14:01.865167 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"
Mar 18 16:14:01 crc kubenswrapper[4792]: E0318 16:14:01.865507 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:14:03 crc kubenswrapper[4792]: I0318 16:14:03.085235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-smhwj" event={"ID":"c829052d-b5b1-4943-b4e1-75a1f5ebc66b","Type":"ContainerStarted","Data":"818d99250585d776b9ca0ef705eed762196731edc4c287e97eb130c260980c52"}
Mar 18 16:14:03 crc kubenswrapper[4792]: I0318 16:14:03.107461 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564174-smhwj" podStartSLOduration=1.5169456609999998 podStartE2EDuration="3.107438634s" podCreationTimestamp="2026-03-18 16:14:00 +0000 UTC" firstStartedPulling="2026-03-18 16:14:00.992691387 +0000 UTC m=+2389.862020314" lastFinishedPulling="2026-03-18 16:14:02.58318435 +0000 UTC m=+2391.452513287" observedRunningTime="2026-03-18 16:14:03.103619805 +0000 UTC m=+2391.972948742" watchObservedRunningTime="2026-03-18 16:14:03.107438634 +0000 UTC m=+2391.976767571"
Mar 18 16:14:04 crc kubenswrapper[4792]: I0318 16:14:04.096709 4792 generic.go:334] "Generic (PLEG): container finished" podID="c829052d-b5b1-4943-b4e1-75a1f5ebc66b" containerID="818d99250585d776b9ca0ef705eed762196731edc4c287e97eb130c260980c52" exitCode=0
Mar 18 16:14:04 crc kubenswrapper[4792]: I0318 16:14:04.096758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-smhwj" event={"ID":"c829052d-b5b1-4943-b4e1-75a1f5ebc66b","Type":"ContainerDied","Data":"818d99250585d776b9ca0ef705eed762196731edc4c287e97eb130c260980c52"}
Mar 18 16:14:05 crc kubenswrapper[4792]: I0318 16:14:05.512619 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-smhwj"
Mar 18 16:14:05 crc kubenswrapper[4792]: I0318 16:14:05.596254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdvx2\" (UniqueName: \"kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2\") pod \"c829052d-b5b1-4943-b4e1-75a1f5ebc66b\" (UID: \"c829052d-b5b1-4943-b4e1-75a1f5ebc66b\") "
Mar 18 16:14:05 crc kubenswrapper[4792]: I0318 16:14:05.608504 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2" (OuterVolumeSpecName: "kube-api-access-xdvx2") pod "c829052d-b5b1-4943-b4e1-75a1f5ebc66b" (UID: "c829052d-b5b1-4943-b4e1-75a1f5ebc66b"). InnerVolumeSpecName "kube-api-access-xdvx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:14:05 crc kubenswrapper[4792]: I0318 16:14:05.699737 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdvx2\" (UniqueName: \"kubernetes.io/projected/c829052d-b5b1-4943-b4e1-75a1f5ebc66b-kube-api-access-xdvx2\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:06 crc kubenswrapper[4792]: I0318 16:14:06.119055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564174-smhwj" event={"ID":"c829052d-b5b1-4943-b4e1-75a1f5ebc66b","Type":"ContainerDied","Data":"ffa15111588dfd930741a4405b2bf45b5ac840c5c148b65c121a412cc2c41f39"}
Mar 18 16:14:06 crc kubenswrapper[4792]: I0318 16:14:06.119094 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa15111588dfd930741a4405b2bf45b5ac840c5c148b65c121a412cc2c41f39"
Mar 18 16:14:06 crc kubenswrapper[4792]: I0318 16:14:06.119360 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564174-smhwj"
Mar 18 16:14:06 crc kubenswrapper[4792]: I0318 16:14:06.158757 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-qcsc6"]
Mar 18 16:14:06 crc kubenswrapper[4792]: I0318 16:14:06.168497 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564168-qcsc6"]
Mar 18 16:14:07 crc kubenswrapper[4792]: I0318 16:14:07.869182 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c867cce4-ccfb-4dc8-baed-fb506ed4d909" path="/var/lib/kubelet/pods/c867cce4-ccfb-4dc8-baed-fb506ed4d909/volumes"
Mar 18 16:14:16 crc kubenswrapper[4792]: I0318 16:14:16.854403 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"
Mar 18 16:14:16 crc kubenswrapper[4792]: E0318 16:14:16.855117 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:14:18 crc kubenswrapper[4792]: I0318 16:14:18.254893 4792 generic.go:334] "Generic (PLEG): container finished" podID="8e02ebcc-002d-4a76-b1f6-12298f2beaa0" containerID="ff3b3104c5357a7556721b5ce1d2da69ea1b16cd3fe89b34a953f2504d4c80ff" exitCode=0
Mar 18 16:14:18 crc kubenswrapper[4792]: I0318 16:14:18.254965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" event={"ID":"8e02ebcc-002d-4a76-b1f6-12298f2beaa0","Type":"ContainerDied","Data":"ff3b3104c5357a7556721b5ce1d2da69ea1b16cd3fe89b34a953f2504d4c80ff"}
Mar 18 16:14:19 crc kubenswrapper[4792]: I0318 16:14:19.987700 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.103410 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc5h8\" (UniqueName: \"kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8\") pod \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") "
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.103783 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory\") pod \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") "
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.103815 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam\") pod \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") "
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.103959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle\") pod \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") "
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.104047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0\") pod \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\" (UID: \"8e02ebcc-002d-4a76-b1f6-12298f2beaa0\") "
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.109468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8e02ebcc-002d-4a76-b1f6-12298f2beaa0" (UID: "8e02ebcc-002d-4a76-b1f6-12298f2beaa0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.110246 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8" (OuterVolumeSpecName: "kube-api-access-pc5h8") pod "8e02ebcc-002d-4a76-b1f6-12298f2beaa0" (UID: "8e02ebcc-002d-4a76-b1f6-12298f2beaa0"). InnerVolumeSpecName "kube-api-access-pc5h8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.138218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8e02ebcc-002d-4a76-b1f6-12298f2beaa0" (UID: "8e02ebcc-002d-4a76-b1f6-12298f2beaa0"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.139847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory" (OuterVolumeSpecName: "inventory") pod "8e02ebcc-002d-4a76-b1f6-12298f2beaa0" (UID: "8e02ebcc-002d-4a76-b1f6-12298f2beaa0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.140351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e02ebcc-002d-4a76-b1f6-12298f2beaa0" (UID: "8e02ebcc-002d-4a76-b1f6-12298f2beaa0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.206942 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc5h8\" (UniqueName: \"kubernetes.io/projected/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-kube-api-access-pc5h8\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.206994 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.207004 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.207014 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.207023 4792 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8e02ebcc-002d-4a76-b1f6-12298f2beaa0-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.280149 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74" event={"ID":"8e02ebcc-002d-4a76-b1f6-12298f2beaa0","Type":"ContainerDied","Data":"e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222"}
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.280193 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f2f55cdcc8e1c500aac7c41656f88ebd9261b3d52b3c7415423db9072db222"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.280217 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kgh74"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.381416 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"]
Mar 18 16:14:20 crc kubenswrapper[4792]: E0318 16:14:20.381905 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e02ebcc-002d-4a76-b1f6-12298f2beaa0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.381923 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e02ebcc-002d-4a76-b1f6-12298f2beaa0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:14:20 crc kubenswrapper[4792]: E0318 16:14:20.381996 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c829052d-b5b1-4943-b4e1-75a1f5ebc66b" containerName="oc"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.382007 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c829052d-b5b1-4943-b4e1-75a1f5ebc66b" containerName="oc"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.382234 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e02ebcc-002d-4a76-b1f6-12298f2beaa0" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.382251 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c829052d-b5b1-4943-b4e1-75a1f5ebc66b" containerName="oc"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.383057 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.393676 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.393737 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.393696 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.393951 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.394000 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.393993 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.405287 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"]
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.515880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.516133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.516313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nptc\" (UniqueName: \"kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.516406 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.516523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.516597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619442 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nptc\" (UniqueName: \"kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.619562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.625697 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.625771 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.630959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.631524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.638371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.652882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nptc\" (UniqueName: \"kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:20 crc kubenswrapper[4792]: I0318 16:14:20.708262 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"
Mar 18 16:14:21 crc kubenswrapper[4792]: I0318 16:14:21.295939 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb"]
Mar 18 16:14:22 crc kubenswrapper[4792]: I0318 16:14:22.306098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" event={"ID":"1c471f93-cba4-46c2-9bdf-cb58f530f1a6","Type":"ContainerStarted","Data":"1e9960f69a729a1f7dc04f5e8bc19ff988faa86ad30bf00e493c172e272f7fe7"}
Mar 18 16:14:22 crc kubenswrapper[4792]: I0318 16:14:22.307390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" event={"ID":"1c471f93-cba4-46c2-9bdf-cb58f530f1a6","Type":"ContainerStarted","Data":"31f94fd55f20633d66415a7a0065aef76689a6f2b65db1c1c0f25449e0690a28"}
Mar 18 16:14:22 crc kubenswrapper[4792]: I0318 16:14:22.327842 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" podStartSLOduration=1.659303081 podStartE2EDuration="2.327821318s" podCreationTimestamp="2026-03-18 16:14:20 +0000 UTC" firstStartedPulling="2026-03-18 16:14:21.304489198 +0000 UTC m=+2410.173818155" lastFinishedPulling="2026-03-18 16:14:21.973007455 +0000 UTC m=+2410.842336392" observedRunningTime="2026-03-18 16:14:22.32503233 +0000 UTC m=+2411.194361267" watchObservedRunningTime="2026-03-18 16:14:22.327821318 +0000 UTC m=+2411.197150255"
Mar 18 16:14:30 crc kubenswrapper[4792]: I0318 16:14:30.854426 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"
Mar 18 16:14:30 crc kubenswrapper[4792]: E0318 16:14:30.855345 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:14:43 crc kubenswrapper[4792]: I0318 16:14:43.854998 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"
Mar 18 16:14:43 crc kubenswrapper[4792]: E0318 16:14:43.855759 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:14:48 crc kubenswrapper[4792]: I0318 16:14:48.976079 4792 scope.go:117] "RemoveContainer" containerID="de33aa65296ff72106b1bb7f0106193224eb85bacd2274dfc3dcb8f3cc30e2dd"
Mar 18 16:14:49 crc kubenswrapper[4792]: I0318 16:14:49.014777 4792 scope.go:117] "RemoveContainer" containerID="8c470ec44517b7477672daeaf399d3ec1c62b0cdd4573ec6ad924f77e13eb28a"
Mar 18 16:14:54 crc kubenswrapper[4792]: I0318 16:14:54.854383 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27"
Mar 18 16:14:54 crc kubenswrapper[4792]: E0318 16:14:54.855173 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.150227 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"]
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.152921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.156418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.159588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.164483 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"]
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.251199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.251234 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.251575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nks2w\" (UniqueName: \"kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.353886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nks2w\" (UniqueName: \"kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.353944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.353983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.354863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.363951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.371929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nks2w\" (UniqueName: \"kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w\") pod \"collect-profiles-29564175-2w9pv\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:00 crc kubenswrapper[4792]: I0318 16:15:00.487999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:01 crc kubenswrapper[4792]: I0318 16:15:01.009874 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"]
Mar 18 16:15:01 crc kubenswrapper[4792]: I0318 16:15:01.715517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv" event={"ID":"1fa5c39e-0993-45e7-9a44-e392347f3c05","Type":"ContainerStarted","Data":"dbfbe879da63272be5dfd0faad37ea9c110e9f8cd4dc8da4b1f7b620cac5e1bb"}
Mar 18 16:15:01 crc kubenswrapper[4792]: I0318 16:15:01.715945 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv" event={"ID":"1fa5c39e-0993-45e7-9a44-e392347f3c05","Type":"ContainerStarted","Data":"d0569dcf874d102fd94bbf1e071e304c777fe44bf6c7c120ffa69f43c886f8be"}
Mar 18 16:15:01 crc kubenswrapper[4792]: I0318 16:15:01.734107 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv" podStartSLOduration=1.73408628 podStartE2EDuration="1.73408628s" podCreationTimestamp="2026-03-18 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:15:01.732416108 +0000 UTC m=+2450.601745055" watchObservedRunningTime="2026-03-18 16:15:01.73408628 +0000 UTC m=+2450.603415227"
Mar 18 16:15:02 crc kubenswrapper[4792]: I0318 16:15:02.730024 4792 generic.go:334] "Generic (PLEG): container finished" podID="1fa5c39e-0993-45e7-9a44-e392347f3c05" containerID="dbfbe879da63272be5dfd0faad37ea9c110e9f8cd4dc8da4b1f7b620cac5e1bb" exitCode=0
Mar 18 16:15:02 crc kubenswrapper[4792]: I0318 16:15:02.730086 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv" event={"ID":"1fa5c39e-0993-45e7-9a44-e392347f3c05","Type":"ContainerDied","Data":"dbfbe879da63272be5dfd0faad37ea9c110e9f8cd4dc8da4b1f7b620cac5e1bb"}
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.152234 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.250841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume\") pod \"1fa5c39e-0993-45e7-9a44-e392347f3c05\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") "
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.250922 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume\") pod \"1fa5c39e-0993-45e7-9a44-e392347f3c05\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") "
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.251238 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nks2w\" (UniqueName: \"kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w\") pod \"1fa5c39e-0993-45e7-9a44-e392347f3c05\" (UID: \"1fa5c39e-0993-45e7-9a44-e392347f3c05\") "
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.253919 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume" (OuterVolumeSpecName: "config-volume") pod "1fa5c39e-0993-45e7-9a44-e392347f3c05" (UID: "1fa5c39e-0993-45e7-9a44-e392347f3c05"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.261081 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1fa5c39e-0993-45e7-9a44-e392347f3c05" (UID: "1fa5c39e-0993-45e7-9a44-e392347f3c05"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.278505 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w" (OuterVolumeSpecName: "kube-api-access-nks2w") pod "1fa5c39e-0993-45e7-9a44-e392347f3c05" (UID: "1fa5c39e-0993-45e7-9a44-e392347f3c05"). InnerVolumeSpecName "kube-api-access-nks2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.354587 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fa5c39e-0993-45e7-9a44-e392347f3c05-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.354634 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fa5c39e-0993-45e7-9a44-e392347f3c05-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.354650 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nks2w\" (UniqueName: \"kubernetes.io/projected/1fa5c39e-0993-45e7-9a44-e392347f3c05-kube-api-access-nks2w\") on node \"crc\" DevicePath \"\""
Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.796019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"
event={"ID":"1fa5c39e-0993-45e7-9a44-e392347f3c05","Type":"ContainerDied","Data":"d0569dcf874d102fd94bbf1e071e304c777fe44bf6c7c120ffa69f43c886f8be"} Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.796064 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv" Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.796345 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0569dcf874d102fd94bbf1e071e304c777fe44bf6c7c120ffa69f43c886f8be" Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.826279 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v"] Mar 18 16:15:04 crc kubenswrapper[4792]: I0318 16:15:04.838290 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564130-zxv4v"] Mar 18 16:15:05 crc kubenswrapper[4792]: I0318 16:15:05.806413 4792 generic.go:334] "Generic (PLEG): container finished" podID="1c471f93-cba4-46c2-9bdf-cb58f530f1a6" containerID="1e9960f69a729a1f7dc04f5e8bc19ff988faa86ad30bf00e493c172e272f7fe7" exitCode=0 Mar 18 16:15:05 crc kubenswrapper[4792]: I0318 16:15:05.806542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" event={"ID":"1c471f93-cba4-46c2-9bdf-cb58f530f1a6","Type":"ContainerDied","Data":"1e9960f69a729a1f7dc04f5e8bc19ff988faa86ad30bf00e493c172e272f7fe7"} Mar 18 16:15:05 crc kubenswrapper[4792]: I0318 16:15:05.855221 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:15:05 crc kubenswrapper[4792]: E0318 16:15:05.855538 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:15:05 crc kubenswrapper[4792]: I0318 16:15:05.877888 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c77956c-88bf-4e94-a8de-a41728753ccd" path="/var/lib/kubelet/pods/2c77956c-88bf-4e94-a8de-a41728753ccd/volumes" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.311031 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.431756 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.431851 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.431905 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.432193 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6nptc\" (UniqueName: \"kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.432316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.432349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory\") pod \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\" (UID: \"1c471f93-cba4-46c2-9bdf-cb58f530f1a6\") " Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.438965 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.446049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc" (OuterVolumeSpecName: "kube-api-access-6nptc") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "kube-api-access-6nptc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.467168 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.472737 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.475967 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.483254 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory" (OuterVolumeSpecName: "inventory") pod "1c471f93-cba4-46c2-9bdf-cb58f530f1a6" (UID: "1c471f93-cba4-46c2-9bdf-cb58f530f1a6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535032 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nptc\" (UniqueName: \"kubernetes.io/projected/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-kube-api-access-6nptc\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535077 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535095 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535108 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535121 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.535135 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c471f93-cba4-46c2-9bdf-cb58f530f1a6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.829575 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" event={"ID":"1c471f93-cba4-46c2-9bdf-cb58f530f1a6","Type":"ContainerDied","Data":"31f94fd55f20633d66415a7a0065aef76689a6f2b65db1c1c0f25449e0690a28"} Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.829614 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f94fd55f20633d66415a7a0065aef76689a6f2b65db1c1c0f25449e0690a28" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.829638 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.916106 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c"] Mar 18 16:15:07 crc kubenswrapper[4792]: E0318 16:15:07.916628 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa5c39e-0993-45e7-9a44-e392347f3c05" containerName="collect-profiles" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.916642 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa5c39e-0993-45e7-9a44-e392347f3c05" containerName="collect-profiles" Mar 18 16:15:07 crc kubenswrapper[4792]: E0318 16:15:07.916692 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c471f93-cba4-46c2-9bdf-cb58f530f1a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.916700 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c471f93-cba4-46c2-9bdf-cb58f530f1a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.916893 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c471f93-cba4-46c2-9bdf-cb58f530f1a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 16:15:07 crc kubenswrapper[4792]: 
I0318 16:15:07.916941 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa5c39e-0993-45e7-9a44-e392347f3c05" containerName="collect-profiles" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.917861 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.920270 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.920580 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.920689 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.920810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.920945 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:15:07 crc kubenswrapper[4792]: I0318 16:15:07.933022 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c"] Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.048456 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.048504 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.048567 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.048958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ljn7\" (UniqueName: \"kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.049160 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.151560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.151659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ljn7\" (UniqueName: \"kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.151737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.152114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.152144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.156625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.156882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.157266 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.158127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.176773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ljn7\" (UniqueName: 
\"kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wt24c\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.254558 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.776749 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c"] Mar 18 16:15:08 crc kubenswrapper[4792]: I0318 16:15:08.844513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" event={"ID":"119672bf-abf7-4a5d-8aee-d3fde8085ed9","Type":"ContainerStarted","Data":"a5dcb7a443cec04e60450fa24a5e7531c77162118d1c4b26bf9ada1e37de23da"} Mar 18 16:15:09 crc kubenswrapper[4792]: I0318 16:15:09.874483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" event={"ID":"119672bf-abf7-4a5d-8aee-d3fde8085ed9","Type":"ContainerStarted","Data":"2d1e4a4e71690995694704cbcf66addb6108d285522b06d6c838538148f32a67"} Mar 18 16:15:09 crc kubenswrapper[4792]: I0318 16:15:09.907707 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" podStartSLOduration=2.457116089 podStartE2EDuration="2.907668099s" podCreationTimestamp="2026-03-18 16:15:07 +0000 UTC" firstStartedPulling="2026-03-18 16:15:08.784367949 +0000 UTC m=+2457.653696886" lastFinishedPulling="2026-03-18 16:15:09.234919959 +0000 UTC m=+2458.104248896" observedRunningTime="2026-03-18 16:15:09.89119263 +0000 UTC m=+2458.760521577" watchObservedRunningTime="2026-03-18 16:15:09.907668099 +0000 UTC m=+2458.776997036" Mar 18 16:15:20 crc 
kubenswrapper[4792]: I0318 16:15:20.854598 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:15:20 crc kubenswrapper[4792]: E0318 16:15:20.855407 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:15:31 crc kubenswrapper[4792]: I0318 16:15:31.862601 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:15:31 crc kubenswrapper[4792]: E0318 16:15:31.863513 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:15:42 crc kubenswrapper[4792]: I0318 16:15:42.854805 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:15:42 crc kubenswrapper[4792]: E0318 16:15:42.855681 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 
18 16:15:49 crc kubenswrapper[4792]: I0318 16:15:49.134856 4792 scope.go:117] "RemoveContainer" containerID="1bca90bfe8f505b35b43705bca61006e30650fd20a2a8fc4daa0d3a4946760d4" Mar 18 16:15:53 crc kubenswrapper[4792]: I0318 16:15:53.854230 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:15:53 crc kubenswrapper[4792]: E0318 16:15:53.855166 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.152068 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564176-dpsh9"] Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.154284 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.160639 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.161541 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.163785 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.170413 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-dpsh9"] Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.341250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8nqw\" (UniqueName: \"kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw\") pod \"auto-csr-approver-29564176-dpsh9\" (UID: \"ae37da08-4fe7-4848-9799-69148e19d197\") " pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.444628 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8nqw\" (UniqueName: \"kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw\") pod \"auto-csr-approver-29564176-dpsh9\" (UID: \"ae37da08-4fe7-4848-9799-69148e19d197\") " pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.471110 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8nqw\" (UniqueName: \"kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw\") pod \"auto-csr-approver-29564176-dpsh9\" (UID: \"ae37da08-4fe7-4848-9799-69148e19d197\") " 
pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.477136 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:00 crc kubenswrapper[4792]: I0318 16:16:00.998699 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-dpsh9"] Mar 18 16:16:01 crc kubenswrapper[4792]: I0318 16:16:01.792125 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" event={"ID":"ae37da08-4fe7-4848-9799-69148e19d197","Type":"ContainerStarted","Data":"535b3d8c5d0899e208c822be1321c0e3426e56aedc8d90b1e998a56f669ab0af"} Mar 18 16:16:05 crc kubenswrapper[4792]: I0318 16:16:05.107153 4792 generic.go:334] "Generic (PLEG): container finished" podID="ae37da08-4fe7-4848-9799-69148e19d197" containerID="8b0c22a07a4d35035ec51d45a28296c01f7521d5f4d6e0a7171875022fb3462c" exitCode=0 Mar 18 16:16:05 crc kubenswrapper[4792]: I0318 16:16:05.107264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" event={"ID":"ae37da08-4fe7-4848-9799-69148e19d197","Type":"ContainerDied","Data":"8b0c22a07a4d35035ec51d45a28296c01f7521d5f4d6e0a7171875022fb3462c"} Mar 18 16:16:05 crc kubenswrapper[4792]: I0318 16:16:05.854545 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:16:05 crc kubenswrapper[4792]: E0318 16:16:05.855286 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
Mar 18 16:16:06 crc kubenswrapper[4792]: I0318 16:16:06.504445 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:06 crc kubenswrapper[4792]: I0318 16:16:06.609449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8nqw\" (UniqueName: \"kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw\") pod \"ae37da08-4fe7-4848-9799-69148e19d197\" (UID: \"ae37da08-4fe7-4848-9799-69148e19d197\") " Mar 18 16:16:06 crc kubenswrapper[4792]: I0318 16:16:06.620248 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw" (OuterVolumeSpecName: "kube-api-access-w8nqw") pod "ae37da08-4fe7-4848-9799-69148e19d197" (UID: "ae37da08-4fe7-4848-9799-69148e19d197"). InnerVolumeSpecName "kube-api-access-w8nqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:16:06 crc kubenswrapper[4792]: I0318 16:16:06.713345 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8nqw\" (UniqueName: \"kubernetes.io/projected/ae37da08-4fe7-4848-9799-69148e19d197-kube-api-access-w8nqw\") on node \"crc\" DevicePath \"\"" Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.128877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" event={"ID":"ae37da08-4fe7-4848-9799-69148e19d197","Type":"ContainerDied","Data":"535b3d8c5d0899e208c822be1321c0e3426e56aedc8d90b1e998a56f669ab0af"} Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.129200 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535b3d8c5d0899e208c822be1321c0e3426e56aedc8d90b1e998a56f669ab0af" Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.128934 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564176-dpsh9" Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.591451 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-2kb6g"] Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.602468 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564170-2kb6g"] Mar 18 16:16:07 crc kubenswrapper[4792]: I0318 16:16:07.900152 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cbddd2-7778-4435-983d-cec16396df53" path="/var/lib/kubelet/pods/22cbddd2-7778-4435-983d-cec16396df53/volumes" Mar 18 16:16:19 crc kubenswrapper[4792]: I0318 16:16:19.854644 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:16:19 crc kubenswrapper[4792]: E0318 16:16:19.855531 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:16:33 crc kubenswrapper[4792]: I0318 16:16:33.855516 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:16:34 crc kubenswrapper[4792]: I0318 16:16:34.480920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d"} Mar 18 16:16:49 crc kubenswrapper[4792]: I0318 16:16:49.216135 4792 scope.go:117] "RemoveContainer" 
containerID="7b531196521034eb6e3534d78385fd50ca336da9e96d216ddbac3e53adca1991" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.154303 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564178-th78d"] Mar 18 16:18:00 crc kubenswrapper[4792]: E0318 16:18:00.156269 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae37da08-4fe7-4848-9799-69148e19d197" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.156306 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae37da08-4fe7-4848-9799-69148e19d197" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.156901 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae37da08-4fe7-4848-9799-69148e19d197" containerName="oc" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.159050 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.162370 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.165807 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-th78d"] Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.168358 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.168555 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.188789 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxmp\" (UniqueName: \"kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp\") 
pod \"auto-csr-approver-29564178-th78d\" (UID: \"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2\") " pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.291131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxmp\" (UniqueName: \"kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp\") pod \"auto-csr-approver-29564178-th78d\" (UID: \"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2\") " pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.316126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxmp\" (UniqueName: \"kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp\") pod \"auto-csr-approver-29564178-th78d\" (UID: \"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2\") " pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.507126 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.986485 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-th78d"] Mar 18 16:18:00 crc kubenswrapper[4792]: I0318 16:18:00.992073 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:18:01 crc kubenswrapper[4792]: I0318 16:18:01.552359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-th78d" event={"ID":"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2","Type":"ContainerStarted","Data":"1cb04724453dd318b2c6623029beee642cf918897ccded4acf7c44eb9c9f1246"} Mar 18 16:18:03 crc kubenswrapper[4792]: I0318 16:18:03.584494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-th78d" event={"ID":"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2","Type":"ContainerStarted","Data":"e3068cf8f827d98403e8bc91f9452f3159beca1241b4bd6085fc544a45e34c04"} Mar 18 16:18:03 crc kubenswrapper[4792]: I0318 16:18:03.600148 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564178-th78d" podStartSLOduration=1.452630601 podStartE2EDuration="3.600128151s" podCreationTimestamp="2026-03-18 16:18:00 +0000 UTC" firstStartedPulling="2026-03-18 16:18:00.991791223 +0000 UTC m=+2629.861120160" lastFinishedPulling="2026-03-18 16:18:03.139288773 +0000 UTC m=+2632.008617710" observedRunningTime="2026-03-18 16:18:03.597495399 +0000 UTC m=+2632.466824356" watchObservedRunningTime="2026-03-18 16:18:03.600128151 +0000 UTC m=+2632.469457088" Mar 18 16:18:04 crc kubenswrapper[4792]: I0318 16:18:04.599948 4792 generic.go:334] "Generic (PLEG): container finished" podID="e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" containerID="e3068cf8f827d98403e8bc91f9452f3159beca1241b4bd6085fc544a45e34c04" exitCode=0 Mar 18 16:18:04 crc 
kubenswrapper[4792]: I0318 16:18:04.600073 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-th78d" event={"ID":"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2","Type":"ContainerDied","Data":"e3068cf8f827d98403e8bc91f9452f3159beca1241b4bd6085fc544a45e34c04"} Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.031217 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.139897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxmp\" (UniqueName: \"kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp\") pod \"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2\" (UID: \"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2\") " Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.147241 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp" (OuterVolumeSpecName: "kube-api-access-gxxmp") pod "e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" (UID: "e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2"). InnerVolumeSpecName "kube-api-access-gxxmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.243647 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxxmp\" (UniqueName: \"kubernetes.io/projected/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2-kube-api-access-gxxmp\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.622282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564178-th78d" event={"ID":"e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2","Type":"ContainerDied","Data":"1cb04724453dd318b2c6623029beee642cf918897ccded4acf7c44eb9c9f1246"} Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.622328 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb04724453dd318b2c6623029beee642cf918897ccded4acf7c44eb9c9f1246" Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.622399 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564178-th78d" Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.708033 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-97lzn"] Mar 18 16:18:06 crc kubenswrapper[4792]: I0318 16:18:06.743354 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564172-97lzn"] Mar 18 16:18:07 crc kubenswrapper[4792]: I0318 16:18:07.868547 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577931be-2eb9-4fc1-bbc3-9e552ede9dc7" path="/var/lib/kubelet/pods/577931be-2eb9-4fc1-bbc3-9e552ede9dc7/volumes" Mar 18 16:18:49 crc kubenswrapper[4792]: I0318 16:18:49.347715 4792 scope.go:117] "RemoveContainer" containerID="2cb7c3f4ee8e1682c4f0198e11c4087871da32cc1cd6dce9e2dfa86495df539f" Mar 18 16:18:55 crc kubenswrapper[4792]: I0318 16:18:55.166065 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="119672bf-abf7-4a5d-8aee-d3fde8085ed9" containerID="2d1e4a4e71690995694704cbcf66addb6108d285522b06d6c838538148f32a67" exitCode=0 Mar 18 16:18:55 crc kubenswrapper[4792]: I0318 16:18:55.166641 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" event={"ID":"119672bf-abf7-4a5d-8aee-d3fde8085ed9","Type":"ContainerDied","Data":"2d1e4a4e71690995694704cbcf66addb6108d285522b06d6c838538148f32a67"} Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.739047 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.833823 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ljn7\" (UniqueName: \"kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7\") pod \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.833941 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle\") pod \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.834077 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory\") pod \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.834146 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam\") pod \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.834166 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0\") pod \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\" (UID: \"119672bf-abf7-4a5d-8aee-d3fde8085ed9\") " Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.839339 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "119672bf-abf7-4a5d-8aee-d3fde8085ed9" (UID: "119672bf-abf7-4a5d-8aee-d3fde8085ed9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.839664 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7" (OuterVolumeSpecName: "kube-api-access-8ljn7") pod "119672bf-abf7-4a5d-8aee-d3fde8085ed9" (UID: "119672bf-abf7-4a5d-8aee-d3fde8085ed9"). InnerVolumeSpecName "kube-api-access-8ljn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.873359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory" (OuterVolumeSpecName: "inventory") pod "119672bf-abf7-4a5d-8aee-d3fde8085ed9" (UID: "119672bf-abf7-4a5d-8aee-d3fde8085ed9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.873771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "119672bf-abf7-4a5d-8aee-d3fde8085ed9" (UID: "119672bf-abf7-4a5d-8aee-d3fde8085ed9"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.879792 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "119672bf-abf7-4a5d-8aee-d3fde8085ed9" (UID: "119672bf-abf7-4a5d-8aee-d3fde8085ed9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.938164 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ljn7\" (UniqueName: \"kubernetes.io/projected/119672bf-abf7-4a5d-8aee-d3fde8085ed9-kube-api-access-8ljn7\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.938204 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.938215 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.938227 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:56 crc kubenswrapper[4792]: I0318 16:18:56.938239 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/119672bf-abf7-4a5d-8aee-d3fde8085ed9-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.201249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" event={"ID":"119672bf-abf7-4a5d-8aee-d3fde8085ed9","Type":"ContainerDied","Data":"a5dcb7a443cec04e60450fa24a5e7531c77162118d1c4b26bf9ada1e37de23da"} Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.201329 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5dcb7a443cec04e60450fa24a5e7531c77162118d1c4b26bf9ada1e37de23da" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.201461 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wt24c" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.300756 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7"] Mar 18 16:18:57 crc kubenswrapper[4792]: E0318 16:18:57.301297 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119672bf-abf7-4a5d-8aee-d3fde8085ed9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.301317 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="119672bf-abf7-4a5d-8aee-d3fde8085ed9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:18:57 crc kubenswrapper[4792]: E0318 16:18:57.301336 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" containerName="oc" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.301342 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" containerName="oc" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.304523 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="119672bf-abf7-4a5d-8aee-d3fde8085ed9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.304558 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" containerName="oc" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.305601 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.307996 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.308804 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.309117 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.309271 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.309287 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.309359 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.309487 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.326248 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7"] Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454530 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: 
I0318 16:18:57.454592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgg8l\" (UniqueName: \"kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454638 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454917 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.454955 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.455001 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.455026 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.557576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.557931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc 
kubenswrapper[4792]: I0318 16:18:57.558059 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558086 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558117 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgg8l\" (UniqueName: 
\"kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.558396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.559853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc 
kubenswrapper[4792]: I0318 16:18:57.562722 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.562863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.562988 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.563385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.563945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.564067 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.565301 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.565546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.565962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.576923 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgg8l\" (UniqueName: \"kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-btjw7\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:57 crc kubenswrapper[4792]: I0318 16:18:57.629012 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:18:58 crc kubenswrapper[4792]: I0318 16:18:58.191907 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7"] Mar 18 16:18:58 crc kubenswrapper[4792]: W0318 16:18:58.198547 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf85aca0_8253_402e_92dd_df87a3dbaf01.slice/crio-70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec WatchSource:0}: Error finding container 70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec: Status 404 returned error can't find the container with id 70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec Mar 18 16:18:59 crc kubenswrapper[4792]: I0318 16:18:59.218029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" event={"ID":"bf85aca0-8253-402e-92dd-df87a3dbaf01","Type":"ContainerStarted","Data":"70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec"} Mar 18 16:19:00 crc kubenswrapper[4792]: I0318 16:19:00.231084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" event={"ID":"bf85aca0-8253-402e-92dd-df87a3dbaf01","Type":"ContainerStarted","Data":"aa0f619a5ab16e1ce76fe7d7f19342c5efbe774f72024a46668e3ac3dc6a81e9"} Mar 18 16:19:00 crc kubenswrapper[4792]: 
I0318 16:19:00.267724 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" podStartSLOduration=2.183850338 podStartE2EDuration="3.267704081s" podCreationTimestamp="2026-03-18 16:18:57 +0000 UTC" firstStartedPulling="2026-03-18 16:18:58.207768587 +0000 UTC m=+2687.077097524" lastFinishedPulling="2026-03-18 16:18:59.29162233 +0000 UTC m=+2688.160951267" observedRunningTime="2026-03-18 16:19:00.256615711 +0000 UTC m=+2689.125944648" watchObservedRunningTime="2026-03-18 16:19:00.267704081 +0000 UTC m=+2689.137033018" Mar 18 16:19:00 crc kubenswrapper[4792]: I0318 16:19:00.321933 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:19:00 crc kubenswrapper[4792]: I0318 16:19:00.322005 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:19:30 crc kubenswrapper[4792]: I0318 16:19:30.321916 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:19:30 crc kubenswrapper[4792]: I0318 16:19:30.322382 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.179251 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"] Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.182705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.197023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"] Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.255894 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dknl\" (UniqueName: \"kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.256228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.256328 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.358846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.359025 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dknl\" (UniqueName: \"kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.359161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.359427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.359640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.379427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dknl\" (UniqueName: 
\"kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl\") pod \"redhat-operators-vp7sb\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") " pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:50 crc kubenswrapper[4792]: I0318 16:19:50.517426 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:19:51 crc kubenswrapper[4792]: I0318 16:19:51.071823 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"] Mar 18 16:19:51 crc kubenswrapper[4792]: I0318 16:19:51.164571 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerStarted","Data":"218ebac793635765d1daf991b0164bb238b5d0abfd7f5c2bc22514eb75176a0e"} Mar 18 16:19:52 crc kubenswrapper[4792]: I0318 16:19:52.176703 4792 generic.go:334] "Generic (PLEG): container finished" podID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerID="25c1c7823d5d3b5f3c0327d275bfaed6d6aaf2a721768e8b2ac3919a1b2e8623" exitCode=0 Mar 18 16:19:52 crc kubenswrapper[4792]: I0318 16:19:52.176888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerDied","Data":"25c1c7823d5d3b5f3c0327d275bfaed6d6aaf2a721768e8b2ac3919a1b2e8623"} Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.150788 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564180-66652"] Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.154139 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.156244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.156560 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.156571 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.168717 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-66652"] Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.198158 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhswt\" (UniqueName: \"kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt\") pod \"auto-csr-approver-29564180-66652\" (UID: \"7eeeae2e-ecbf-4682-912b-0835922d05a7\") " pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.301054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhswt\" (UniqueName: \"kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt\") pod \"auto-csr-approver-29564180-66652\" (UID: \"7eeeae2e-ecbf-4682-912b-0835922d05a7\") " pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.322229 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 
16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.322346 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.322416 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.323639 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.323720 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d" gracePeriod=600 Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.324059 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhswt\" (UniqueName: \"kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt\") pod \"auto-csr-approver-29564180-66652\" (UID: \"7eeeae2e-ecbf-4682-912b-0835922d05a7\") " pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:00 crc kubenswrapper[4792]: I0318 16:20:00.486205 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:01 crc kubenswrapper[4792]: I0318 16:20:01.289422 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d" exitCode=0 Mar 18 16:20:01 crc kubenswrapper[4792]: I0318 16:20:01.289605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d"} Mar 18 16:20:01 crc kubenswrapper[4792]: I0318 16:20:01.289751 4792 scope.go:117] "RemoveContainer" containerID="befaa0b913161a07b9011e3fc28516b3325e99dd93f82170774c9df03fbe2e27" Mar 18 16:20:09 crc kubenswrapper[4792]: E0318 16:20:09.258802 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 16:20:09 crc kubenswrapper[4792]: E0318 16:20:09.259608 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8dknl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vp7sb_openshift-marketplace(a8a131ee-9ed6-4243-b856-c688d4fd9b89): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 16:20:09 crc kubenswrapper[4792]: E0318 16:20:09.260926 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" Mar 18 16:20:09 crc 
kubenswrapper[4792]: E0318 16:20:09.413171 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" Mar 18 16:20:09 crc kubenswrapper[4792]: I0318 16:20:09.639896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-66652"] Mar 18 16:20:10 crc kubenswrapper[4792]: I0318 16:20:10.422193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"} Mar 18 16:20:10 crc kubenswrapper[4792]: I0318 16:20:10.424285 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-66652" event={"ID":"7eeeae2e-ecbf-4682-912b-0835922d05a7","Type":"ContainerStarted","Data":"a1cb5dc62e9cf679e4aea4ed1a5a4f462f6c1639daaa15c68aec055c7880f71c"} Mar 18 16:20:11 crc kubenswrapper[4792]: I0318 16:20:11.444178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-66652" event={"ID":"7eeeae2e-ecbf-4682-912b-0835922d05a7","Type":"ContainerStarted","Data":"f8b40ef76e5a170fb941b2fdb12acfc281e0f73ad80d1fa2211a689d1e7af66c"} Mar 18 16:20:11 crc kubenswrapper[4792]: I0318 16:20:11.464923 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564180-66652" podStartSLOduration=10.135478621 podStartE2EDuration="11.464899345s" podCreationTimestamp="2026-03-18 16:20:00 +0000 UTC" firstStartedPulling="2026-03-18 16:20:09.641050035 +0000 UTC m=+2758.510378972" lastFinishedPulling="2026-03-18 16:20:10.970470719 +0000 UTC 
m=+2759.839799696" observedRunningTime="2026-03-18 16:20:11.461093615 +0000 UTC m=+2760.330422552" watchObservedRunningTime="2026-03-18 16:20:11.464899345 +0000 UTC m=+2760.334228282" Mar 18 16:20:12 crc kubenswrapper[4792]: I0318 16:20:12.458446 4792 generic.go:334] "Generic (PLEG): container finished" podID="7eeeae2e-ecbf-4682-912b-0835922d05a7" containerID="f8b40ef76e5a170fb941b2fdb12acfc281e0f73ad80d1fa2211a689d1e7af66c" exitCode=0 Mar 18 16:20:12 crc kubenswrapper[4792]: I0318 16:20:12.458514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-66652" event={"ID":"7eeeae2e-ecbf-4682-912b-0835922d05a7","Type":"ContainerDied","Data":"f8b40ef76e5a170fb941b2fdb12acfc281e0f73ad80d1fa2211a689d1e7af66c"} Mar 18 16:20:13 crc kubenswrapper[4792]: I0318 16:20:13.906248 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.070929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhswt\" (UniqueName: \"kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt\") pod \"7eeeae2e-ecbf-4682-912b-0835922d05a7\" (UID: \"7eeeae2e-ecbf-4682-912b-0835922d05a7\") " Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.090831 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt" (OuterVolumeSpecName: "kube-api-access-nhswt") pod "7eeeae2e-ecbf-4682-912b-0835922d05a7" (UID: "7eeeae2e-ecbf-4682-912b-0835922d05a7"). InnerVolumeSpecName "kube-api-access-nhswt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.174799 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhswt\" (UniqueName: \"kubernetes.io/projected/7eeeae2e-ecbf-4682-912b-0835922d05a7-kube-api-access-nhswt\") on node \"crc\" DevicePath \"\"" Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.481461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564180-66652" event={"ID":"7eeeae2e-ecbf-4682-912b-0835922d05a7","Type":"ContainerDied","Data":"a1cb5dc62e9cf679e4aea4ed1a5a4f462f6c1639daaa15c68aec055c7880f71c"} Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.481509 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1cb5dc62e9cf679e4aea4ed1a5a4f462f6c1639daaa15c68aec055c7880f71c" Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.481522 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564180-66652" Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.545397 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-smhwj"] Mar 18 16:20:14 crc kubenswrapper[4792]: I0318 16:20:14.558814 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564174-smhwj"] Mar 18 16:20:15 crc kubenswrapper[4792]: I0318 16:20:15.872939 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c829052d-b5b1-4943-b4e1-75a1f5ebc66b" path="/var/lib/kubelet/pods/c829052d-b5b1-4943-b4e1-75a1f5ebc66b/volumes" Mar 18 16:20:29 crc kubenswrapper[4792]: I0318 16:20:29.650480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerStarted","Data":"11bcfe2e652fb3b0ba293d65c6d50de8f4ba1033b18f5dcc2476a348640124ea"} 
Mar 18 16:20:35 crc kubenswrapper[4792]: I0318 16:20:35.721148 4792 generic.go:334] "Generic (PLEG): container finished" podID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerID="11bcfe2e652fb3b0ba293d65c6d50de8f4ba1033b18f5dcc2476a348640124ea" exitCode=0 Mar 18 16:20:35 crc kubenswrapper[4792]: I0318 16:20:35.721233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerDied","Data":"11bcfe2e652fb3b0ba293d65c6d50de8f4ba1033b18f5dcc2476a348640124ea"} Mar 18 16:20:36 crc kubenswrapper[4792]: I0318 16:20:36.740077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerStarted","Data":"d1ca4c6e10e20d178e271c3882dbe33690af31be007849b0397258e893b23bc2"} Mar 18 16:20:36 crc kubenswrapper[4792]: I0318 16:20:36.772790 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vp7sb" podStartSLOduration=2.829679041 podStartE2EDuration="46.772766443s" podCreationTimestamp="2026-03-18 16:19:50 +0000 UTC" firstStartedPulling="2026-03-18 16:19:52.178834851 +0000 UTC m=+2741.048163788" lastFinishedPulling="2026-03-18 16:20:36.121922253 +0000 UTC m=+2784.991251190" observedRunningTime="2026-03-18 16:20:36.765263297 +0000 UTC m=+2785.634592244" watchObservedRunningTime="2026-03-18 16:20:36.772766443 +0000 UTC m=+2785.642095380" Mar 18 16:20:40 crc kubenswrapper[4792]: I0318 16:20:40.517783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:20:40 crc kubenswrapper[4792]: I0318 16:20:40.519521 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:20:41 crc kubenswrapper[4792]: I0318 16:20:41.564225 4792 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" probeResult="failure" output=< Mar 18 16:20:41 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:20:41 crc kubenswrapper[4792]: > Mar 18 16:20:49 crc kubenswrapper[4792]: I0318 16:20:49.448116 4792 scope.go:117] "RemoveContainer" containerID="818d99250585d776b9ca0ef705eed762196731edc4c287e97eb130c260980c52" Mar 18 16:20:51 crc kubenswrapper[4792]: I0318 16:20:51.577800 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" probeResult="failure" output=< Mar 18 16:20:51 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:20:51 crc kubenswrapper[4792]: > Mar 18 16:21:01 crc kubenswrapper[4792]: I0318 16:21:01.571722 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" probeResult="failure" output=< Mar 18 16:21:01 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:21:01 crc kubenswrapper[4792]: > Mar 18 16:21:10 crc kubenswrapper[4792]: I0318 16:21:10.592325 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:21:10 crc kubenswrapper[4792]: I0318 16:21:10.665352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vp7sb" Mar 18 16:21:10 crc kubenswrapper[4792]: I0318 16:21:10.732754 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"] Mar 18 16:21:10 crc kubenswrapper[4792]: I0318 16:21:10.848464 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:21:10 crc kubenswrapper[4792]: I0318 16:21:10.848716 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4t4j4" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" containerID="cri-o://d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" gracePeriod=2 Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.130885 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerID="d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" exitCode=0 Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.131172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerDied","Data":"d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d"} Mar 18 16:21:11 crc kubenswrapper[4792]: E0318 16:21:11.344701 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d is running failed: container process not found" containerID="d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 16:21:11 crc kubenswrapper[4792]: E0318 16:21:11.348255 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d is running failed: container process not found" containerID="d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 16:21:11 crc kubenswrapper[4792]: E0318 16:21:11.348733 4792 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d is running failed: container process not found" containerID="d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" cmd=["grpc_health_probe","-addr=:50051"] Mar 18 16:21:11 crc kubenswrapper[4792]: E0318 16:21:11.348776 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-4t4j4" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.521091 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.548581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content\") pod \"f1d558b7-3e50-46fb-b7b8-269c00392479\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.548789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities\") pod \"f1d558b7-3e50-46fb-b7b8-269c00392479\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.549320 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzpx\" (UniqueName: \"kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx\") pod 
\"f1d558b7-3e50-46fb-b7b8-269c00392479\" (UID: \"f1d558b7-3e50-46fb-b7b8-269c00392479\") " Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.549610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities" (OuterVolumeSpecName: "utilities") pod "f1d558b7-3e50-46fb-b7b8-269c00392479" (UID: "f1d558b7-3e50-46fb-b7b8-269c00392479"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.550582 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.576492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx" (OuterVolumeSpecName: "kube-api-access-qkzpx") pod "f1d558b7-3e50-46fb-b7b8-269c00392479" (UID: "f1d558b7-3e50-46fb-b7b8-269c00392479"). InnerVolumeSpecName "kube-api-access-qkzpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.652728 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzpx\" (UniqueName: \"kubernetes.io/projected/f1d558b7-3e50-46fb-b7b8-269c00392479-kube-api-access-qkzpx\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.737963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1d558b7-3e50-46fb-b7b8-269c00392479" (UID: "f1d558b7-3e50-46fb-b7b8-269c00392479"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:21:11 crc kubenswrapper[4792]: I0318 16:21:11.755303 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d558b7-3e50-46fb-b7b8-269c00392479-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.142962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4t4j4" event={"ID":"f1d558b7-3e50-46fb-b7b8-269c00392479","Type":"ContainerDied","Data":"498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5"} Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.143087 4792 scope.go:117] "RemoveContainer" containerID="d96d2e864a0ec547bc8a8c8a07ee035abaa7d1801e12e34596e778d981ef456d" Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.143404 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4t4j4" Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.177264 4792 scope.go:117] "RemoveContainer" containerID="cf62489ac4be7e3a63f115eac4e25263b7b85c71892d0b1bf0c757d655691ca4" Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.185256 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.198605 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4t4j4"] Mar 18 16:21:12 crc kubenswrapper[4792]: I0318 16:21:12.204524 4792 scope.go:117] "RemoveContainer" containerID="0c7680dc29e7ad6fc88180e236055e23537c722aa649d820e62cda547e1199ec" Mar 18 16:21:13 crc kubenswrapper[4792]: I0318 16:21:13.890151 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" path="/var/lib/kubelet/pods/f1d558b7-3e50-46fb-b7b8-269c00392479/volumes" Mar 18 16:21:16 crc 
kubenswrapper[4792]: E0318 16:21:16.269114 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:24 crc kubenswrapper[4792]: E0318 16:21:24.213373 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:25 crc kubenswrapper[4792]: I0318 16:21:25.276648 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf85aca0-8253-402e-92dd-df87a3dbaf01" containerID="aa0f619a5ab16e1ce76fe7d7f19342c5efbe774f72024a46668e3ac3dc6a81e9" exitCode=0 Mar 18 16:21:25 crc kubenswrapper[4792]: I0318 16:21:25.276751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" event={"ID":"bf85aca0-8253-402e-92dd-df87a3dbaf01","Type":"ContainerDied","Data":"aa0f619a5ab16e1ce76fe7d7f19342c5efbe774f72024a46668e3ac3dc6a81e9"} Mar 18 16:21:26 crc kubenswrapper[4792]: E0318 16:21:26.317483 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:26 crc kubenswrapper[4792]: I0318 16:21:26.877962 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.063924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064079 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgg8l\" (UniqueName: 
\"kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064151 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064353 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: 
\"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.064489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam\") pod \"bf85aca0-8253-402e-92dd-df87a3dbaf01\" (UID: \"bf85aca0-8253-402e-92dd-df87a3dbaf01\") " Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.078869 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l" (OuterVolumeSpecName: "kube-api-access-mgg8l") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "kube-api-access-mgg8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.082223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.108503 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory" (OuterVolumeSpecName: "inventory") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.110174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.112280 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.112460 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.117983 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.121713 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.128211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.130223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.138111 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "bf85aca0-8253-402e-92dd-df87a3dbaf01" (UID: "bf85aca0-8253-402e-92dd-df87a3dbaf01"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167248 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167477 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167538 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167646 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167722 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc 
kubenswrapper[4792]: I0318 16:21:27.167808 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167890 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgg8l\" (UniqueName: \"kubernetes.io/projected/bf85aca0-8253-402e-92dd-df87a3dbaf01-kube-api-access-mgg8l\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.167995 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.168090 4792 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.168170 4792 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.168234 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bf85aca0-8253-402e-92dd-df87a3dbaf01-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.298005 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" event={"ID":"bf85aca0-8253-402e-92dd-df87a3dbaf01","Type":"ContainerDied","Data":"70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec"} Mar 18 
16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.298048 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ad5a8fca1d3da6e6ac4135c03f09dce6479d70e9d1fc6a0d196234ebff3fec" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.298360 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-btjw7" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.397664 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp"] Mar 18 16:21:27 crc kubenswrapper[4792]: E0318 16:21:27.398202 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf85aca0-8253-402e-92dd-df87a3dbaf01" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398220 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf85aca0-8253-402e-92dd-df87a3dbaf01" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 16:21:27 crc kubenswrapper[4792]: E0318 16:21:27.398246 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="extract-content" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398251 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="extract-content" Mar 18 16:21:27 crc kubenswrapper[4792]: E0318 16:21:27.398268 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eeeae2e-ecbf-4682-912b-0835922d05a7" containerName="oc" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398274 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeeae2e-ecbf-4682-912b-0835922d05a7" containerName="oc" Mar 18 16:21:27 crc kubenswrapper[4792]: E0318 16:21:27.398302 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" 
containerName="extract-utilities" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398309 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="extract-utilities" Mar 18 16:21:27 crc kubenswrapper[4792]: E0318 16:21:27.398326 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398332 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398545 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d558b7-3e50-46fb-b7b8-269c00392479" containerName="registry-server" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398557 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf85aca0-8253-402e-92dd-df87a3dbaf01" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.398585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eeeae2e-ecbf-4682-912b-0835922d05a7" containerName="oc" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.399609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.407074 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.407674 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.407734 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.407862 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.408218 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp"] Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.409178 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.577554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578512 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578584 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: 
\"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.578765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwjr\" (UniqueName: \"kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc 
kubenswrapper[4792]: I0318 16:21:27.681419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681508 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwjr\" (UniqueName: \"kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.681556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.687934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.688001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.689740 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.692838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.698524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.704022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwjr\" (UniqueName: \"kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.704133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:27 crc kubenswrapper[4792]: I0318 16:21:27.727653 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:21:28 crc kubenswrapper[4792]: I0318 16:21:28.334171 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp"] Mar 18 16:21:29 crc kubenswrapper[4792]: I0318 16:21:29.323303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" event={"ID":"c675fc82-9ad3-4eae-8918-00ab1d6fd06d","Type":"ContainerStarted","Data":"84384f49e3bc0b808a8909cc10cf6cce23708c405c529251ca5d08594b412097"} Mar 18 16:21:29 crc kubenswrapper[4792]: I0318 16:21:29.323661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" event={"ID":"c675fc82-9ad3-4eae-8918-00ab1d6fd06d","Type":"ContainerStarted","Data":"364aea36fcc3066d844ca2a310299d2cc649f644e2b917e7f631ab09d58ca453"} Mar 18 16:21:29 crc kubenswrapper[4792]: I0318 16:21:29.346387 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" podStartSLOduration=1.87185551 podStartE2EDuration="2.34636663s" podCreationTimestamp="2026-03-18 16:21:27 +0000 UTC" firstStartedPulling="2026-03-18 16:21:28.341191564 +0000 UTC m=+2837.210520501" lastFinishedPulling="2026-03-18 16:21:28.815702684 +0000 UTC m=+2837.685031621" observedRunningTime="2026-03-18 16:21:29.345078249 +0000 UTC m=+2838.214407196" watchObservedRunningTime="2026-03-18 16:21:29.34636663 +0000 UTC m=+2838.215695567" Mar 18 16:21:36 crc kubenswrapper[4792]: E0318 16:21:36.615172 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:38 crc kubenswrapper[4792]: E0318 16:21:38.952797 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:46 crc kubenswrapper[4792]: E0318 16:21:46.899884 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:48 crc kubenswrapper[4792]: E0318 16:21:48.104655 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:48 crc kubenswrapper[4792]: E0318 
16:21:48.105008 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:54 crc kubenswrapper[4792]: E0318 16:21:54.247013 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:21:56 crc kubenswrapper[4792]: E0318 16:21:56.944631 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache]" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.145428 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564182-p8crf"] Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.147273 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.150134 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.150150 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.151021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.159176 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-p8crf"] Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.253920 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8nt\" (UniqueName: \"kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt\") pod \"auto-csr-approver-29564182-p8crf\" (UID: \"82c0482d-5b4f-41f8-b10a-817c7f129804\") " pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.356191 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8nt\" (UniqueName: \"kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt\") pod \"auto-csr-approver-29564182-p8crf\" (UID: \"82c0482d-5b4f-41f8-b10a-817c7f129804\") " pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.376349 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8nt\" (UniqueName: \"kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt\") pod \"auto-csr-approver-29564182-p8crf\" (UID: \"82c0482d-5b4f-41f8-b10a-817c7f129804\") " 
pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.470668 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:00 crc kubenswrapper[4792]: I0318 16:22:00.966173 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-p8crf"] Mar 18 16:22:01 crc kubenswrapper[4792]: I0318 16:22:01.311925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-p8crf" event={"ID":"82c0482d-5b4f-41f8-b10a-817c7f129804","Type":"ContainerStarted","Data":"08ee6b135a44c08b5a3afc0ae82cbf0d774e6fbc21c82050965822b1f6e1fd49"} Mar 18 16:22:03 crc kubenswrapper[4792]: I0318 16:22:03.337215 4792 generic.go:334] "Generic (PLEG): container finished" podID="82c0482d-5b4f-41f8-b10a-817c7f129804" containerID="c8ee1e138e5b3d4c65f557e82fb155486fa6e7629c21c09269d1bd9c9293bf8a" exitCode=0 Mar 18 16:22:03 crc kubenswrapper[4792]: I0318 16:22:03.337261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-p8crf" event={"ID":"82c0482d-5b4f-41f8-b10a-817c7f129804","Type":"ContainerDied","Data":"c8ee1e138e5b3d4c65f557e82fb155486fa6e7629c21c09269d1bd9c9293bf8a"} Mar 18 16:22:04 crc kubenswrapper[4792]: I0318 16:22:04.748809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:04 crc kubenswrapper[4792]: I0318 16:22:04.876087 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf8nt\" (UniqueName: \"kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt\") pod \"82c0482d-5b4f-41f8-b10a-817c7f129804\" (UID: \"82c0482d-5b4f-41f8-b10a-817c7f129804\") " Mar 18 16:22:04 crc kubenswrapper[4792]: I0318 16:22:04.886237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt" (OuterVolumeSpecName: "kube-api-access-pf8nt") pod "82c0482d-5b4f-41f8-b10a-817c7f129804" (UID: "82c0482d-5b4f-41f8-b10a-817c7f129804"). InnerVolumeSpecName "kube-api-access-pf8nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:22:04 crc kubenswrapper[4792]: I0318 16:22:04.980027 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf8nt\" (UniqueName: \"kubernetes.io/projected/82c0482d-5b4f-41f8-b10a-817c7f129804-kube-api-access-pf8nt\") on node \"crc\" DevicePath \"\"" Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.371258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564182-p8crf" event={"ID":"82c0482d-5b4f-41f8-b10a-817c7f129804","Type":"ContainerDied","Data":"08ee6b135a44c08b5a3afc0ae82cbf0d774e6fbc21c82050965822b1f6e1fd49"} Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.371324 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ee6b135a44c08b5a3afc0ae82cbf0d774e6fbc21c82050965822b1f6e1fd49" Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.371415 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564182-p8crf" Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.833828 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-dpsh9"] Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.845930 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564176-dpsh9"] Mar 18 16:22:05 crc kubenswrapper[4792]: I0318 16:22:05.871988 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae37da08-4fe7-4848-9799-69148e19d197" path="/var/lib/kubelet/pods/ae37da08-4fe7-4848-9799-69148e19d197/volumes" Mar 18 16:22:07 crc kubenswrapper[4792]: E0318 16:22:07.002818 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:22:09 crc kubenswrapper[4792]: E0318 16:22:09.188254 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice/crio-498931d823f09c615a263bedd90a61c3c8420f5167d6f9ffb7bf406517bfa6b5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d558b7_3e50_46fb_b7b8_269c00392479.slice\": RecentStats: unable to find data in memory cache]" Mar 18 16:22:29 crc kubenswrapper[4792]: I0318 16:22:29.941625 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 
16:22:29 crc kubenswrapper[4792]: E0318 16:22:29.942667 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c0482d-5b4f-41f8-b10a-817c7f129804" containerName="oc" Mar 18 16:22:29 crc kubenswrapper[4792]: I0318 16:22:29.942682 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c0482d-5b4f-41f8-b10a-817c7f129804" containerName="oc" Mar 18 16:22:29 crc kubenswrapper[4792]: I0318 16:22:29.942936 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c0482d-5b4f-41f8-b10a-817c7f129804" containerName="oc" Mar 18 16:22:29 crc kubenswrapper[4792]: I0318 16:22:29.944695 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:29 crc kubenswrapper[4792]: I0318 16:22:29.954277 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.106924 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.107068 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trxq2\" (UniqueName: \"kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.107094 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.209502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.209616 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trxq2\" (UniqueName: \"kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.209658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.210094 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.210309 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.230632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trxq2\" (UniqueName: \"kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2\") pod \"redhat-marketplace-dnhhd\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.268084 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.321450 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.321526 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:22:30 crc kubenswrapper[4792]: I0318 16:22:30.759512 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 16:22:31 crc kubenswrapper[4792]: I0318 16:22:31.666151 4792 generic.go:334] "Generic (PLEG): container finished" podID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerID="9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b" exitCode=0 Mar 18 16:22:31 
crc kubenswrapper[4792]: I0318 16:22:31.666317 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerDied","Data":"9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b"} Mar 18 16:22:31 crc kubenswrapper[4792]: I0318 16:22:31.666516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerStarted","Data":"4a0dadc88fe5dd60ec114afc6e3b3565c551cffae046262b891fc87b83659e2c"} Mar 18 16:22:33 crc kubenswrapper[4792]: I0318 16:22:33.712436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerStarted","Data":"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab"} Mar 18 16:22:34 crc kubenswrapper[4792]: I0318 16:22:34.723365 4792 generic.go:334] "Generic (PLEG): container finished" podID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerID="527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab" exitCode=0 Mar 18 16:22:34 crc kubenswrapper[4792]: I0318 16:22:34.723420 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerDied","Data":"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab"} Mar 18 16:22:36 crc kubenswrapper[4792]: I0318 16:22:36.745984 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerStarted","Data":"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20"} Mar 18 16:22:36 crc kubenswrapper[4792]: I0318 16:22:36.768545 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-dnhhd" podStartSLOduration=3.891216237 podStartE2EDuration="7.768525579s" podCreationTimestamp="2026-03-18 16:22:29 +0000 UTC" firstStartedPulling="2026-03-18 16:22:31.66978615 +0000 UTC m=+2900.539115087" lastFinishedPulling="2026-03-18 16:22:35.547095492 +0000 UTC m=+2904.416424429" observedRunningTime="2026-03-18 16:22:36.761095956 +0000 UTC m=+2905.630424903" watchObservedRunningTime="2026-03-18 16:22:36.768525579 +0000 UTC m=+2905.637854516" Mar 18 16:22:40 crc kubenswrapper[4792]: I0318 16:22:40.269119 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:40 crc kubenswrapper[4792]: I0318 16:22:40.270386 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:40 crc kubenswrapper[4792]: I0318 16:22:40.328390 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:40 crc kubenswrapper[4792]: I0318 16:22:40.834937 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:40 crc kubenswrapper[4792]: I0318 16:22:40.902515 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 16:22:42 crc kubenswrapper[4792]: I0318 16:22:42.800832 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dnhhd" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="registry-server" containerID="cri-o://07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20" gracePeriod=2 Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.362195 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.376269 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities\") pod \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.376557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content\") pod \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.376688 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trxq2\" (UniqueName: \"kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2\") pod \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\" (UID: \"a2ea931b-63c7-462c-b353-cf8cd36a5f59\") " Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.377676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities" (OuterVolumeSpecName: "utilities") pod "a2ea931b-63c7-462c-b353-cf8cd36a5f59" (UID: "a2ea931b-63c7-462c-b353-cf8cd36a5f59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.383720 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2" (OuterVolumeSpecName: "kube-api-access-trxq2") pod "a2ea931b-63c7-462c-b353-cf8cd36a5f59" (UID: "a2ea931b-63c7-462c-b353-cf8cd36a5f59"). InnerVolumeSpecName "kube-api-access-trxq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.479210 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trxq2\" (UniqueName: \"kubernetes.io/projected/a2ea931b-63c7-462c-b353-cf8cd36a5f59-kube-api-access-trxq2\") on node \"crc\" DevicePath \"\"" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.479245 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.598564 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2ea931b-63c7-462c-b353-cf8cd36a5f59" (UID: "a2ea931b-63c7-462c-b353-cf8cd36a5f59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.682911 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ea931b-63c7-462c-b353-cf8cd36a5f59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.815726 4792 generic.go:334] "Generic (PLEG): container finished" podID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerID="07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20" exitCode=0 Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.815778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerDied","Data":"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20"} Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.815809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dnhhd" event={"ID":"a2ea931b-63c7-462c-b353-cf8cd36a5f59","Type":"ContainerDied","Data":"4a0dadc88fe5dd60ec114afc6e3b3565c551cffae046262b891fc87b83659e2c"} Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.815832 4792 scope.go:117] "RemoveContainer" containerID="07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.816987 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhhd" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.848219 4792 scope.go:117] "RemoveContainer" containerID="527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.881788 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.883112 4792 scope.go:117] "RemoveContainer" containerID="9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.885857 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhhd"] Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.943653 4792 scope.go:117] "RemoveContainer" containerID="07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20" Mar 18 16:22:43 crc kubenswrapper[4792]: E0318 16:22:43.947085 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20\": container with ID starting with 07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20 not found: ID does not exist" containerID="07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.947126 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20"} err="failed to get container status \"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20\": rpc error: code = NotFound desc = could not find container \"07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20\": container with ID starting with 07bb0adb7b63c56bfdea2b15260dc7b41b58a7dfb66b7106de0c1bdb1c1aae20 not found: ID does not exist" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.947150 4792 scope.go:117] "RemoveContainer" containerID="527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab" Mar 18 16:22:43 crc kubenswrapper[4792]: E0318 16:22:43.947649 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab\": container with ID starting with 527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab not found: ID does not exist" containerID="527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.947671 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab"} err="failed to get container status \"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab\": rpc error: code = NotFound desc = could not find container \"527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab\": container with ID starting with 527f5b9c513830fc676ac463a88d49eca08f46a2c959142ab477c15fcd52f6ab not found: ID does not exist" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.947684 4792 scope.go:117] "RemoveContainer" containerID="9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b" Mar 18 16:22:43 crc kubenswrapper[4792]: E0318 
16:22:43.947990 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b\": container with ID starting with 9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b not found: ID does not exist" containerID="9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b" Mar 18 16:22:43 crc kubenswrapper[4792]: I0318 16:22:43.948015 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b"} err="failed to get container status \"9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b\": rpc error: code = NotFound desc = could not find container \"9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b\": container with ID starting with 9297e9e934efbbaa50d9c7e3c040ffdcef350b1fce82da165e60b6cbda8b790b not found: ID does not exist" Mar 18 16:22:45 crc kubenswrapper[4792]: I0318 16:22:45.869051 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" path="/var/lib/kubelet/pods/a2ea931b-63c7-462c-b353-cf8cd36a5f59/volumes" Mar 18 16:22:49 crc kubenswrapper[4792]: I0318 16:22:49.602066 4792 scope.go:117] "RemoveContainer" containerID="8b0c22a07a4d35035ec51d45a28296c01f7521d5f4d6e0a7171875022fb3462c" Mar 18 16:23:00 crc kubenswrapper[4792]: I0318 16:23:00.321685 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:23:00 crc kubenswrapper[4792]: I0318 16:23:00.322475 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:23:30 crc kubenswrapper[4792]: I0318 16:23:30.322138 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:23:30 crc kubenswrapper[4792]: I0318 16:23:30.322654 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:23:30 crc kubenswrapper[4792]: I0318 16:23:30.322706 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:23:30 crc kubenswrapper[4792]: I0318 16:23:30.323653 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:23:30 crc kubenswrapper[4792]: I0318 16:23:30.323720 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" gracePeriod=600 Mar 18 
16:23:30 crc kubenswrapper[4792]: E0318 16:23:30.949187 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:23:31 crc kubenswrapper[4792]: I0318 16:23:31.348597 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" exitCode=0 Mar 18 16:23:31 crc kubenswrapper[4792]: I0318 16:23:31.348654 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"} Mar 18 16:23:31 crc kubenswrapper[4792]: I0318 16:23:31.348699 4792 scope.go:117] "RemoveContainer" containerID="919815f8e807458116e4782122ae529e92a15bfdba2eb450ded44df40389711d" Mar 18 16:23:31 crc kubenswrapper[4792]: I0318 16:23:31.349538 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:23:31 crc kubenswrapper[4792]: E0318 16:23:31.349856 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:23:45 crc kubenswrapper[4792]: I0318 16:23:45.855588 4792 
scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:23:45 crc kubenswrapper[4792]: E0318 16:23:45.856360 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:23:47 crc kubenswrapper[4792]: I0318 16:23:47.511631 4792 generic.go:334] "Generic (PLEG): container finished" podID="c675fc82-9ad3-4eae-8918-00ab1d6fd06d" containerID="84384f49e3bc0b808a8909cc10cf6cce23708c405c529251ca5d08594b412097" exitCode=0 Mar 18 16:23:47 crc kubenswrapper[4792]: I0318 16:23:47.511713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" event={"ID":"c675fc82-9ad3-4eae-8918-00ab1d6fd06d","Type":"ContainerDied","Data":"84384f49e3bc0b808a8909cc10cf6cce23708c405c529251ca5d08594b412097"} Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.036441 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194063 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfwjr\" (UniqueName: \"kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194146 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194224 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194403 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 
16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194577 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.194627 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2\") pod \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\" (UID: \"c675fc82-9ad3-4eae-8918-00ab1d6fd06d\") " Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.199771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr" (OuterVolumeSpecName: "kube-api-access-lfwjr") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "kube-api-access-lfwjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.212952 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.231316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.231379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.232930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.237998 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.243997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory" (OuterVolumeSpecName: "inventory") pod "c675fc82-9ad3-4eae-8918-00ab1d6fd06d" (UID: "c675fc82-9ad3-4eae-8918-00ab1d6fd06d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.297702 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.298042 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.298081 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfwjr\" (UniqueName: \"kubernetes.io/projected/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-kube-api-access-lfwjr\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.298096 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.298123 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.299033 4792 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.299080 4792 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c675fc82-9ad3-4eae-8918-00ab1d6fd06d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.534469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" event={"ID":"c675fc82-9ad3-4eae-8918-00ab1d6fd06d","Type":"ContainerDied","Data":"364aea36fcc3066d844ca2a310299d2cc649f644e2b917e7f631ab09d58ca453"} Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.534517 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364aea36fcc3066d844ca2a310299d2cc649f644e2b917e7f631ab09d58ca453" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.534546 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.647218 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"] Mar 18 16:23:49 crc kubenswrapper[4792]: E0318 16:23:49.647759 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="extract-utilities" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.647777 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="extract-utilities" Mar 18 16:23:49 crc kubenswrapper[4792]: E0318 16:23:49.647802 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="registry-server" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.647809 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="registry-server" Mar 18 16:23:49 crc kubenswrapper[4792]: E0318 16:23:49.647823 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c675fc82-9ad3-4eae-8918-00ab1d6fd06d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.647830 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c675fc82-9ad3-4eae-8918-00ab1d6fd06d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:49 crc kubenswrapper[4792]: E0318 16:23:49.647856 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="extract-content" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.647864 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="extract-content" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 
16:23:49.648120 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c675fc82-9ad3-4eae-8918-00ab1d6fd06d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.648136 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ea931b-63c7-462c-b353-cf8cd36a5f59" containerName="registry-server" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.649140 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.652417 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.652685 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.652811 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.653076 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.653808 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.659398 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"] Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.810728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811323 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811598 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8tz\" (UniqueName: \"kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.811746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914700 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8tz\" (UniqueName: \"kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.914854 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.915811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.919952 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.919941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.920764 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.921092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.921687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.924185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.932613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8tz\" (UniqueName: \"kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:49 crc kubenswrapper[4792]: I0318 16:23:49.970094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"
Mar 18 16:23:50 crc kubenswrapper[4792]: I0318 16:23:50.580690 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt"]
Mar 18 16:23:50 crc kubenswrapper[4792]: I0318 16:23:50.596511 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:23:51 crc kubenswrapper[4792]: I0318 16:23:51.596691 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" event={"ID":"416434fb-7fe4-4872-9d1e-8bb317a4eab1","Type":"ContainerStarted","Data":"8e48bfc2afbf266d53e838adb812c031a454c5200fc548b9dfed9f0fa04f1e89"}
Mar 18 16:23:52 crc kubenswrapper[4792]: I0318 16:23:52.608857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" event={"ID":"416434fb-7fe4-4872-9d1e-8bb317a4eab1","Type":"ContainerStarted","Data":"b69302cb3c29d7d5863b33158a95b266804dcb013a8ec268fbdb66032b0db957"}
Mar 18 16:23:52 crc kubenswrapper[4792]: I0318 16:23:52.638156 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" podStartSLOduration=2.916742947 podStartE2EDuration="3.638138613s" podCreationTimestamp="2026-03-18 16:23:49 +0000 UTC" firstStartedPulling="2026-03-18 16:23:50.596292893 +0000 UTC m=+2979.465621830" lastFinishedPulling="2026-03-18 16:23:51.317688559 +0000 UTC m=+2980.187017496" observedRunningTime="2026-03-18 16:23:52.632873197 +0000 UTC m=+2981.502202134" watchObservedRunningTime="2026-03-18 16:23:52.638138613 +0000 UTC m=+2981.507467550"
Mar 18 16:23:59 crc kubenswrapper[4792]: I0318 16:23:59.856282 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:23:59 crc kubenswrapper[4792]: E0318 16:23:59.857120 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.146234 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564184-fnkcq"]
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.148262 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.150942 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.151238 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.151654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.158023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-fnkcq"]
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.204234 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjsl\" (UniqueName: \"kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl\") pod \"auto-csr-approver-29564184-fnkcq\" (UID: \"6c9fb57e-2f97-4c16-954f-a4dc3276dd58\") " pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.306607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjsl\" (UniqueName: \"kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl\") pod \"auto-csr-approver-29564184-fnkcq\" (UID: \"6c9fb57e-2f97-4c16-954f-a4dc3276dd58\") " pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.330399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjsl\" (UniqueName: \"kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl\") pod \"auto-csr-approver-29564184-fnkcq\" (UID: \"6c9fb57e-2f97-4c16-954f-a4dc3276dd58\") " pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.474545 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:00 crc kubenswrapper[4792]: I0318 16:24:00.977387 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-fnkcq"]
Mar 18 16:24:00 crc kubenswrapper[4792]: W0318 16:24:00.978166 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c9fb57e_2f97_4c16_954f_a4dc3276dd58.slice/crio-6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b WatchSource:0}: Error finding container 6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b: Status 404 returned error can't find the container with id 6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b
Mar 18 16:24:01 crc kubenswrapper[4792]: I0318 16:24:01.703273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-fnkcq" event={"ID":"6c9fb57e-2f97-4c16-954f-a4dc3276dd58","Type":"ContainerStarted","Data":"6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b"}
Mar 18 16:24:02 crc kubenswrapper[4792]: I0318 16:24:02.724664 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c9fb57e-2f97-4c16-954f-a4dc3276dd58" containerID="3263e5b19059da6522a66b3c91147b8f9ac5e6ca77078eaec1621dcdc2ba4f69" exitCode=0
Mar 18 16:24:02 crc kubenswrapper[4792]: I0318 16:24:02.725020 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-fnkcq" event={"ID":"6c9fb57e-2f97-4c16-954f-a4dc3276dd58","Type":"ContainerDied","Data":"3263e5b19059da6522a66b3c91147b8f9ac5e6ca77078eaec1621dcdc2ba4f69"}
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.178847 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.215598 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcjsl\" (UniqueName: \"kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl\") pod \"6c9fb57e-2f97-4c16-954f-a4dc3276dd58\" (UID: \"6c9fb57e-2f97-4c16-954f-a4dc3276dd58\") "
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.226171 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl" (OuterVolumeSpecName: "kube-api-access-xcjsl") pod "6c9fb57e-2f97-4c16-954f-a4dc3276dd58" (UID: "6c9fb57e-2f97-4c16-954f-a4dc3276dd58"). InnerVolumeSpecName "kube-api-access-xcjsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.319841 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcjsl\" (UniqueName: \"kubernetes.io/projected/6c9fb57e-2f97-4c16-954f-a4dc3276dd58-kube-api-access-xcjsl\") on node \"crc\" DevicePath \"\""
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.752369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564184-fnkcq" event={"ID":"6c9fb57e-2f97-4c16-954f-a4dc3276dd58","Type":"ContainerDied","Data":"6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b"}
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.752428 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eefc2a1b317618af603ccb14d3a25530bf164537613ea4ebcd4685579ce984b"
Mar 18 16:24:04 crc kubenswrapper[4792]: I0318 16:24:04.752396 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564184-fnkcq"
Mar 18 16:24:05 crc kubenswrapper[4792]: I0318 16:24:05.262770 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-th78d"]
Mar 18 16:24:05 crc kubenswrapper[4792]: I0318 16:24:05.274017 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564178-th78d"]
Mar 18 16:24:05 crc kubenswrapper[4792]: I0318 16:24:05.866770 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2" path="/var/lib/kubelet/pods/e62f1ae4-f4e2-45cc-a4d3-fab7ac5109a2/volumes"
Mar 18 16:24:14 crc kubenswrapper[4792]: I0318 16:24:14.854737 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:24:14 crc kubenswrapper[4792]: E0318 16:24:14.855758 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:24:26 crc kubenswrapper[4792]: I0318 16:24:26.854132 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:24:26 crc kubenswrapper[4792]: E0318 16:24:26.855075 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:24:39 crc kubenswrapper[4792]: I0318 16:24:39.854748 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:24:39 crc kubenswrapper[4792]: E0318 16:24:39.855595 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:24:49 crc kubenswrapper[4792]: I0318 16:24:49.736949 4792 scope.go:117] "RemoveContainer" containerID="e3068cf8f827d98403e8bc91f9452f3159beca1241b4bd6085fc544a45e34c04"
Mar 18 16:24:54 crc kubenswrapper[4792]: I0318 16:24:54.856051 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:24:54 crc kubenswrapper[4792]: E0318 16:24:54.856947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.714214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:02 crc kubenswrapper[4792]: E0318 16:25:02.719773 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9fb57e-2f97-4c16-954f-a4dc3276dd58" containerName="oc"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.719812 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9fb57e-2f97-4c16-954f-a4dc3276dd58" containerName="oc"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.724522 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9fb57e-2f97-4c16-954f-a4dc3276dd58" containerName="oc"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.731058 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.753008 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.874132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxlw6\" (UniqueName: \"kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.874201 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.874678 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.977269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.977467 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxlw6\" (UniqueName: \"kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.977511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.978122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:02 crc kubenswrapper[4792]: I0318 16:25:02.978398 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.043264 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxlw6\" (UniqueName: \"kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6\") pod \"community-operators-krqdl\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") " pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.068273 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.498642 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"]
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.502281 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.512608 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"]
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.594213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.594295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.594412 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv429\" (UniqueName: \"kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.677688 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.697395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv429\" (UniqueName: \"kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.697907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.698038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.698432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.699031 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.724866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv429\" (UniqueName: \"kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429\") pod \"certified-operators-9c2tw\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:03 crc kubenswrapper[4792]: I0318 16:25:03.827080 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:04 crc kubenswrapper[4792]: I0318 16:25:04.426535 4792 generic.go:334] "Generic (PLEG): container finished" podID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerID="82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087" exitCode=0
Mar 18 16:25:04 crc kubenswrapper[4792]: I0318 16:25:04.426581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerDied","Data":"82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087"}
Mar 18 16:25:04 crc kubenswrapper[4792]: I0318 16:25:04.427395 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerStarted","Data":"d2edba70690f8dc93a78ae5e53408283d9064b1a94335f563ef3cca4daad98b8"}
Mar 18 16:25:04 crc kubenswrapper[4792]: I0318 16:25:04.451653 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"]
Mar 18 16:25:04 crc kubenswrapper[4792]: W0318 16:25:04.457311 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1441a41f_e669_4930_8f58_b0c638824121.slice/crio-1f6c525d1e08893e140a8bc11afb90ebe4e9bc212c9ea33c49cf3be48cc286d5 WatchSource:0}: Error finding container 1f6c525d1e08893e140a8bc11afb90ebe4e9bc212c9ea33c49cf3be48cc286d5: Status 404 returned error can't find the container with id 1f6c525d1e08893e140a8bc11afb90ebe4e9bc212c9ea33c49cf3be48cc286d5
Mar 18 16:25:05 crc kubenswrapper[4792]: I0318 16:25:05.438634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerStarted","Data":"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"}
Mar 18 16:25:05 crc kubenswrapper[4792]: I0318 16:25:05.440139 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerDied","Data":"cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699"}
Mar 18 16:25:05 crc kubenswrapper[4792]: I0318 16:25:05.440046 4792 generic.go:334] "Generic (PLEG): container finished" podID="1441a41f-e669-4930-8f58-b0c638824121" containerID="cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699" exitCode=0
Mar 18 16:25:05 crc kubenswrapper[4792]: I0318 16:25:05.440229 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerStarted","Data":"1f6c525d1e08893e140a8bc11afb90ebe4e9bc212c9ea33c49cf3be48cc286d5"}
Mar 18 16:25:06 crc kubenswrapper[4792]: I0318 16:25:06.462491 4792 generic.go:334] "Generic (PLEG): container finished" podID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerID="80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9" exitCode=0
Mar 18 16:25:06 crc kubenswrapper[4792]: I0318 16:25:06.462900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerDied","Data":"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"}
Mar 18 16:25:07 crc kubenswrapper[4792]: I0318 16:25:07.475538 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerStarted","Data":"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"}
Mar 18 16:25:07 crc kubenswrapper[4792]: I0318 16:25:07.478636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerStarted","Data":"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e"}
Mar 18 16:25:07 crc kubenswrapper[4792]: I0318 16:25:07.502042 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-krqdl" podStartSLOduration=2.879032344 podStartE2EDuration="5.502024159s" podCreationTimestamp="2026-03-18 16:25:02 +0000 UTC" firstStartedPulling="2026-03-18 16:25:04.430677255 +0000 UTC m=+3053.300006232" lastFinishedPulling="2026-03-18 16:25:07.05366909 +0000 UTC m=+3055.922998047" observedRunningTime="2026-03-18 16:25:07.491455627 +0000 UTC m=+3056.360784564" watchObservedRunningTime="2026-03-18 16:25:07.502024159 +0000 UTC m=+3056.371353096"
Mar 18 16:25:07 crc kubenswrapper[4792]: I0318 16:25:07.855583 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:25:07 crc kubenswrapper[4792]: E0318 16:25:07.856089 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 16:25:10 crc kubenswrapper[4792]: I0318 16:25:10.510575 4792 generic.go:334] "Generic (PLEG): container finished" podID="1441a41f-e669-4930-8f58-b0c638824121" containerID="b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e" exitCode=0
Mar 18 16:25:10 crc kubenswrapper[4792]: I0318 16:25:10.510649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerDied","Data":"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e"}
Mar 18 16:25:11 crc kubenswrapper[4792]: I0318 16:25:11.523531 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerStarted","Data":"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c"}
Mar 18 16:25:11 crc kubenswrapper[4792]: I0318 16:25:11.553644 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9c2tw" podStartSLOduration=3.093542994 podStartE2EDuration="8.553624991s" podCreationTimestamp="2026-03-18 16:25:03 +0000 UTC" firstStartedPulling="2026-03-18 16:25:05.441352529 +0000 UTC m=+3054.310681466" lastFinishedPulling="2026-03-18 16:25:10.901434526 +0000 UTC m=+3059.770763463" observedRunningTime="2026-03-18 16:25:11.542686458 +0000 UTC m=+3060.412015395" watchObservedRunningTime="2026-03-18 16:25:11.553624991 +0000 UTC m=+3060.422953928"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.069456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.069803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.131634 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.600412 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.827382 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.827419 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:13 crc kubenswrapper[4792]: I0318 16:25:13.881143 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9c2tw"
Mar 18 16:25:14 crc kubenswrapper[4792]: I0318 16:25:14.494781 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:15 crc kubenswrapper[4792]: I0318 16:25:15.564410 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-krqdl" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="registry-server" containerID="cri-o://a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f" gracePeriod=2
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.041198 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.127460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxlw6\" (UniqueName: \"kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6\") pod \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") "
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.127588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content\") pod \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") "
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.127744 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities\") pod \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\" (UID: \"c8c8b123-83ce-4087-a5f3-3ba212301f4c\") "
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.128616 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities" (OuterVolumeSpecName: "utilities") pod "c8c8b123-83ce-4087-a5f3-3ba212301f4c" (UID: "c8c8b123-83ce-4087-a5f3-3ba212301f4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.136453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6" (OuterVolumeSpecName: "kube-api-access-kxlw6") pod "c8c8b123-83ce-4087-a5f3-3ba212301f4c" (UID: "c8c8b123-83ce-4087-a5f3-3ba212301f4c"). InnerVolumeSpecName "kube-api-access-kxlw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.174545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8c8b123-83ce-4087-a5f3-3ba212301f4c" (UID: "c8c8b123-83ce-4087-a5f3-3ba212301f4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.230897 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxlw6\" (UniqueName: \"kubernetes.io/projected/c8c8b123-83ce-4087-a5f3-3ba212301f4c-kube-api-access-kxlw6\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.230943 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.230955 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8b123-83ce-4087-a5f3-3ba212301f4c-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.583896 4792 generic.go:334] "Generic (PLEG): container finished" podID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerID="a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f" exitCode=0
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.583986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerDied","Data":"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"}
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.584015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqdl" event={"ID":"c8c8b123-83ce-4087-a5f3-3ba212301f4c","Type":"ContainerDied","Data":"d2edba70690f8dc93a78ae5e53408283d9064b1a94335f563ef3cca4daad98b8"}
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.584082 4792 scope.go:117] "RemoveContainer" containerID="a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.584306 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krqdl"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.610450 4792 scope.go:117] "RemoveContainer" containerID="80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.630286 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.642270 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-krqdl"]
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.673414 4792 scope.go:117] "RemoveContainer" containerID="82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.721435 4792 scope.go:117] "RemoveContainer" containerID="a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"
Mar 18 16:25:16 crc kubenswrapper[4792]: E0318 16:25:16.722156 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f\": container with ID starting with a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f not found: ID does not exist" containerID="a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.722196 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f"} err="failed to get container status \"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f\": rpc error: code = NotFound desc = could not find container \"a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f\": container with ID starting with a9a39e9b088658ef1c90da8f2f033f8142657297d8c30654b87eda6ea0445b1f not found: ID does not exist"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.722217 4792 scope.go:117] "RemoveContainer" containerID="80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"
Mar 18 16:25:16 crc kubenswrapper[4792]: E0318 16:25:16.722507 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9\": container with ID starting with 80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9 not found: ID does not exist" containerID="80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.722535 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9"} err="failed to get container status \"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9\": rpc error: code = NotFound desc = could not find container \"80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9\": container with ID starting with 80091661d182385d4514d455e418425d69ab3a1f5060029af302250f55fd11b9 not found: ID does not exist"
Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.722550 4792 scope.go:117] "RemoveContainer" containerID="82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087"
Mar 18 16:25:16 crc 
kubenswrapper[4792]: E0318 16:25:16.722849 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087\": container with ID starting with 82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087 not found: ID does not exist" containerID="82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087" Mar 18 16:25:16 crc kubenswrapper[4792]: I0318 16:25:16.722873 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087"} err="failed to get container status \"82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087\": rpc error: code = NotFound desc = could not find container \"82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087\": container with ID starting with 82c85d6ca4ee3e64a8d140fa1b2dea37a03a726ed55008d4d4fc125a86dde087 not found: ID does not exist" Mar 18 16:25:17 crc kubenswrapper[4792]: I0318 16:25:17.868311 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" path="/var/lib/kubelet/pods/c8c8b123-83ce-4087-a5f3-3ba212301f4c/volumes" Mar 18 16:25:20 crc kubenswrapper[4792]: I0318 16:25:20.854870 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:25:20 crc kubenswrapper[4792]: E0318 16:25:20.855762 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:25:23 crc 
kubenswrapper[4792]: I0318 16:25:23.887063 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9c2tw" Mar 18 16:25:23 crc kubenswrapper[4792]: I0318 16:25:23.940117 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"] Mar 18 16:25:24 crc kubenswrapper[4792]: I0318 16:25:24.681099 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9c2tw" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="registry-server" containerID="cri-o://c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c" gracePeriod=2 Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.218782 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c2tw" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.253512 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv429\" (UniqueName: \"kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429\") pod \"1441a41f-e669-4930-8f58-b0c638824121\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.253576 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities\") pod \"1441a41f-e669-4930-8f58-b0c638824121\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.253730 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content\") pod \"1441a41f-e669-4930-8f58-b0c638824121\" (UID: \"1441a41f-e669-4930-8f58-b0c638824121\") " Mar 18 16:25:25 
crc kubenswrapper[4792]: I0318 16:25:25.254839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities" (OuterVolumeSpecName: "utilities") pod "1441a41f-e669-4930-8f58-b0c638824121" (UID: "1441a41f-e669-4930-8f58-b0c638824121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.259985 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.260685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429" (OuterVolumeSpecName: "kube-api-access-mv429") pod "1441a41f-e669-4930-8f58-b0c638824121" (UID: "1441a41f-e669-4930-8f58-b0c638824121"). InnerVolumeSpecName "kube-api-access-mv429". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.317310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1441a41f-e669-4930-8f58-b0c638824121" (UID: "1441a41f-e669-4930-8f58-b0c638824121"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.364633 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1441a41f-e669-4930-8f58-b0c638824121-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.364688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv429\" (UniqueName: \"kubernetes.io/projected/1441a41f-e669-4930-8f58-b0c638824121-kube-api-access-mv429\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.694470 4792 generic.go:334] "Generic (PLEG): container finished" podID="1441a41f-e669-4930-8f58-b0c638824121" containerID="c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c" exitCode=0 Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.694514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerDied","Data":"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c"} Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.694542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9c2tw" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.694579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c2tw" event={"ID":"1441a41f-e669-4930-8f58-b0c638824121","Type":"ContainerDied","Data":"1f6c525d1e08893e140a8bc11afb90ebe4e9bc212c9ea33c49cf3be48cc286d5"} Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.694606 4792 scope.go:117] "RemoveContainer" containerID="c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.717870 4792 scope.go:117] "RemoveContainer" containerID="b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.741173 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"] Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.752244 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9c2tw"] Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.766198 4792 scope.go:117] "RemoveContainer" containerID="cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.808650 4792 scope.go:117] "RemoveContainer" containerID="c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c" Mar 18 16:25:25 crc kubenswrapper[4792]: E0318 16:25:25.809185 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c\": container with ID starting with c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c not found: ID does not exist" containerID="c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.809229 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c"} err="failed to get container status \"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c\": rpc error: code = NotFound desc = could not find container \"c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c\": container with ID starting with c2c58d63a588503b0aad9bfa8455aab158266cc2f22fa9d7df25b27ccae48b1c not found: ID does not exist" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.809249 4792 scope.go:117] "RemoveContainer" containerID="b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e" Mar 18 16:25:25 crc kubenswrapper[4792]: E0318 16:25:25.809651 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e\": container with ID starting with b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e not found: ID does not exist" containerID="b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.809689 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e"} err="failed to get container status \"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e\": rpc error: code = NotFound desc = could not find container \"b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e\": container with ID starting with b6a72f9cfc305f43144c7a327dd97f85faf5814bfdf337e4bfce2c1d49bf4d5e not found: ID does not exist" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.809703 4792 scope.go:117] "RemoveContainer" containerID="cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699" Mar 18 16:25:25 crc kubenswrapper[4792]: E0318 
16:25:25.809885 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699\": container with ID starting with cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699 not found: ID does not exist" containerID="cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.809919 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699"} err="failed to get container status \"cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699\": rpc error: code = NotFound desc = could not find container \"cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699\": container with ID starting with cbbcdeb251aa2dd0b66d6b0f47f96eea8031060c381a116d24e2abd9deede699 not found: ID does not exist" Mar 18 16:25:25 crc kubenswrapper[4792]: I0318 16:25:25.867877 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1441a41f-e669-4930-8f58-b0c638824121" path="/var/lib/kubelet/pods/1441a41f-e669-4930-8f58-b0c638824121/volumes" Mar 18 16:25:35 crc kubenswrapper[4792]: I0318 16:25:35.855516 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:25:35 crc kubenswrapper[4792]: E0318 16:25:35.856373 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:25:36 crc kubenswrapper[4792]: I0318 16:25:36.815360 
4792 generic.go:334] "Generic (PLEG): container finished" podID="416434fb-7fe4-4872-9d1e-8bb317a4eab1" containerID="b69302cb3c29d7d5863b33158a95b266804dcb013a8ec268fbdb66032b0db957" exitCode=0 Mar 18 16:25:36 crc kubenswrapper[4792]: I0318 16:25:36.815462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" event={"ID":"416434fb-7fe4-4872-9d1e-8bb317a4eab1","Type":"ContainerDied","Data":"b69302cb3c29d7d5863b33158a95b266804dcb013a8ec268fbdb66032b0db957"} Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.333047 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.395331 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.395738 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.395832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.395926 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.396143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.396323 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.396437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz8tz\" (UniqueName: \"kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz\") pod \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\" (UID: \"416434fb-7fe4-4872-9d1e-8bb317a4eab1\") " Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.410122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.416181 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz" (OuterVolumeSpecName: "kube-api-access-rz8tz") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "kube-api-access-rz8tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.435897 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.436605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.437393 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory" (OuterVolumeSpecName: "inventory") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.437888 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.440119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "416434fb-7fe4-4872-9d1e-8bb317a4eab1" (UID: "416434fb-7fe4-4872-9d1e-8bb317a4eab1"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499235 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz8tz\" (UniqueName: \"kubernetes.io/projected/416434fb-7fe4-4872-9d1e-8bb317a4eab1-kube-api-access-rz8tz\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499292 4792 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499309 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499321 4792 reconciler_common.go:293] "Volume detached for 
volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499335 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499351 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.499363 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416434fb-7fe4-4872-9d1e-8bb317a4eab1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.839937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" event={"ID":"416434fb-7fe4-4872-9d1e-8bb317a4eab1","Type":"ContainerDied","Data":"8e48bfc2afbf266d53e838adb812c031a454c5200fc548b9dfed9f0fa04f1e89"} Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.840240 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e48bfc2afbf266d53e838adb812c031a454c5200fc548b9dfed9f0fa04f1e89" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.840249 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.987848 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j"] Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988452 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="extract-utilities" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988475 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="extract-utilities" Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988524 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416434fb-7fe4-4872-9d1e-8bb317a4eab1" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988535 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="416434fb-7fe4-4872-9d1e-8bb317a4eab1" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988561 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="extract-content" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988573 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="extract-content" Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988589 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="registry-server" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988596 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="registry-server" Mar 18 16:25:38 crc 
kubenswrapper[4792]: E0318 16:25:38.988618 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="extract-utilities" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988626 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="extract-utilities" Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988638 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="extract-content" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988646 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="extract-content" Mar 18 16:25:38 crc kubenswrapper[4792]: E0318 16:25:38.988662 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="registry-server" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988670 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="registry-server" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.988992 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1441a41f-e669-4930-8f58-b0c638824121" containerName="registry-server" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.989028 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c8b123-83ce-4087-a5f3-3ba212301f4c" containerName="registry-server" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.989050 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="416434fb-7fe4-4872-9d1e-8bb317a4eab1" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.990041 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.993127 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.993686 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.993962 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.994187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6cq7" Mar 18 16:25:38 crc kubenswrapper[4792]: I0318 16:25:38.994347 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.028108 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j"] Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.122626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.122700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: 
\"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.122840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.122896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwt2\" (UniqueName: \"kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.122943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.225598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 
crc kubenswrapper[4792]: I0318 16:25:39.225848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.226580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.226667 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwt2\" (UniqueName: \"kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.226711 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.231456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.232284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.232942 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.243010 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.244239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwt2\" (UniqueName: \"kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2\") pod \"logging-edpm-deployment-openstack-edpm-ipam-7xk4j\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 
16:25:39.326166 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:39 crc kubenswrapper[4792]: I0318 16:25:39.876516 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j"] Mar 18 16:25:39 crc kubenswrapper[4792]: W0318 16:25:39.878665 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a780ff4_ddc1_4f8b_a4b1_f24062af5089.slice/crio-43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44 WatchSource:0}: Error finding container 43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44: Status 404 returned error can't find the container with id 43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44 Mar 18 16:25:40 crc kubenswrapper[4792]: I0318 16:25:40.861161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" event={"ID":"2a780ff4-ddc1-4f8b-a4b1-f24062af5089","Type":"ContainerStarted","Data":"fdaf24ab164c3ab64ba42bb8b2d5d4ca80c8c577763dd2038c978b3253b2da6c"} Mar 18 16:25:40 crc kubenswrapper[4792]: I0318 16:25:40.861522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" event={"ID":"2a780ff4-ddc1-4f8b-a4b1-f24062af5089","Type":"ContainerStarted","Data":"43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44"} Mar 18 16:25:40 crc kubenswrapper[4792]: I0318 16:25:40.886951 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" podStartSLOduration=2.344548905 podStartE2EDuration="2.886929035s" podCreationTimestamp="2026-03-18 16:25:38 +0000 UTC" firstStartedPulling="2026-03-18 16:25:39.882091745 +0000 UTC m=+3088.751420682" lastFinishedPulling="2026-03-18 
16:25:40.424471875 +0000 UTC m=+3089.293800812" observedRunningTime="2026-03-18 16:25:40.886271936 +0000 UTC m=+3089.755600883" watchObservedRunningTime="2026-03-18 16:25:40.886929035 +0000 UTC m=+3089.756257972" Mar 18 16:25:48 crc kubenswrapper[4792]: I0318 16:25:48.854606 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:25:48 crc kubenswrapper[4792]: E0318 16:25:48.855384 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:25:55 crc kubenswrapper[4792]: I0318 16:25:55.009947 4792 generic.go:334] "Generic (PLEG): container finished" podID="2a780ff4-ddc1-4f8b-a4b1-f24062af5089" containerID="fdaf24ab164c3ab64ba42bb8b2d5d4ca80c8c577763dd2038c978b3253b2da6c" exitCode=0 Mar 18 16:25:55 crc kubenswrapper[4792]: I0318 16:25:55.010035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" event={"ID":"2a780ff4-ddc1-4f8b-a4b1-f24062af5089","Type":"ContainerDied","Data":"fdaf24ab164c3ab64ba42bb8b2d5d4ca80c8c577763dd2038c978b3253b2da6c"} Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.500533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.674904 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0\") pod \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.675007 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam\") pod \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.675068 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory\") pod \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.675350 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xwt2\" (UniqueName: \"kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2\") pod \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.675432 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1\") pod \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\" (UID: \"2a780ff4-ddc1-4f8b-a4b1-f24062af5089\") " Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 
16:25:56.681991 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2" (OuterVolumeSpecName: "kube-api-access-9xwt2") pod "2a780ff4-ddc1-4f8b-a4b1-f24062af5089" (UID: "2a780ff4-ddc1-4f8b-a4b1-f24062af5089"). InnerVolumeSpecName "kube-api-access-9xwt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.707716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "2a780ff4-ddc1-4f8b-a4b1-f24062af5089" (UID: "2a780ff4-ddc1-4f8b-a4b1-f24062af5089"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.712558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory" (OuterVolumeSpecName: "inventory") pod "2a780ff4-ddc1-4f8b-a4b1-f24062af5089" (UID: "2a780ff4-ddc1-4f8b-a4b1-f24062af5089"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.714290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a780ff4-ddc1-4f8b-a4b1-f24062af5089" (UID: "2a780ff4-ddc1-4f8b-a4b1-f24062af5089"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.718898 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "2a780ff4-ddc1-4f8b-a4b1-f24062af5089" (UID: "2a780ff4-ddc1-4f8b-a4b1-f24062af5089"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.778753 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwt2\" (UniqueName: \"kubernetes.io/projected/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-kube-api-access-9xwt2\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.779150 4792 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.779193 4792 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.779206 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:56 crc kubenswrapper[4792]: I0318 16:25:56.779224 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a780ff4-ddc1-4f8b-a4b1-f24062af5089-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 16:25:57 crc kubenswrapper[4792]: I0318 
16:25:57.033694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" event={"ID":"2a780ff4-ddc1-4f8b-a4b1-f24062af5089","Type":"ContainerDied","Data":"43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44"} Mar 18 16:25:57 crc kubenswrapper[4792]: I0318 16:25:57.033737 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43f39d390ec255e901b8c0d58c48b35341ebb6c2a0120e8f9b01e1e135ac5b44" Mar 18 16:25:57 crc kubenswrapper[4792]: I0318 16:25:57.033807 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-7xk4j" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.157110 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564186-fw9hp"] Mar 18 16:26:00 crc kubenswrapper[4792]: E0318 16:26:00.159054 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a780ff4-ddc1-4f8b-a4b1-f24062af5089" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.159087 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a780ff4-ddc1-4f8b-a4b1-f24062af5089" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.159516 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a780ff4-ddc1-4f8b-a4b1-f24062af5089" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.161073 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.164035 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.164264 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.164491 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.171376 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-fw9hp"] Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.266086 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4thqm\" (UniqueName: \"kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm\") pod \"auto-csr-approver-29564186-fw9hp\" (UID: \"f8a6038d-8355-4dbd-8725-6eecd2049fe7\") " pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.368921 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4thqm\" (UniqueName: \"kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm\") pod \"auto-csr-approver-29564186-fw9hp\" (UID: \"f8a6038d-8355-4dbd-8725-6eecd2049fe7\") " pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.389036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thqm\" (UniqueName: \"kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm\") pod \"auto-csr-approver-29564186-fw9hp\" (UID: \"f8a6038d-8355-4dbd-8725-6eecd2049fe7\") " 
pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.483959 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:00 crc kubenswrapper[4792]: I0318 16:26:00.855656 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:26:00 crc kubenswrapper[4792]: E0318 16:26:00.857737 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:26:01 crc kubenswrapper[4792]: I0318 16:26:01.011538 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-fw9hp"] Mar 18 16:26:01 crc kubenswrapper[4792]: I0318 16:26:01.084245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" event={"ID":"f8a6038d-8355-4dbd-8725-6eecd2049fe7","Type":"ContainerStarted","Data":"b3a08cd469101f2b2c69bd2d81abf2fba9742dae1cdf55ccbda329ef5a542837"} Mar 18 16:26:03 crc kubenswrapper[4792]: I0318 16:26:03.110508 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8a6038d-8355-4dbd-8725-6eecd2049fe7" containerID="c91abe37d309d6d06480473eb4eb404a017b78ba01dc201a7083c7b89fbc0491" exitCode=0 Mar 18 16:26:03 crc kubenswrapper[4792]: I0318 16:26:03.110604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" event={"ID":"f8a6038d-8355-4dbd-8725-6eecd2049fe7","Type":"ContainerDied","Data":"c91abe37d309d6d06480473eb4eb404a017b78ba01dc201a7083c7b89fbc0491"} 
Mar 18 16:26:04 crc kubenswrapper[4792]: I0318 16:26:04.563548 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:04 crc kubenswrapper[4792]: I0318 16:26:04.696034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4thqm\" (UniqueName: \"kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm\") pod \"f8a6038d-8355-4dbd-8725-6eecd2049fe7\" (UID: \"f8a6038d-8355-4dbd-8725-6eecd2049fe7\") " Mar 18 16:26:04 crc kubenswrapper[4792]: I0318 16:26:04.701925 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm" (OuterVolumeSpecName: "kube-api-access-4thqm") pod "f8a6038d-8355-4dbd-8725-6eecd2049fe7" (UID: "f8a6038d-8355-4dbd-8725-6eecd2049fe7"). InnerVolumeSpecName "kube-api-access-4thqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:26:04 crc kubenswrapper[4792]: I0318 16:26:04.799253 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4thqm\" (UniqueName: \"kubernetes.io/projected/f8a6038d-8355-4dbd-8725-6eecd2049fe7-kube-api-access-4thqm\") on node \"crc\" DevicePath \"\"" Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.131906 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" event={"ID":"f8a6038d-8355-4dbd-8725-6eecd2049fe7","Type":"ContainerDied","Data":"b3a08cd469101f2b2c69bd2d81abf2fba9742dae1cdf55ccbda329ef5a542837"} Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.132548 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a08cd469101f2b2c69bd2d81abf2fba9742dae1cdf55ccbda329ef5a542837" Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.132200 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564186-fw9hp" Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.651433 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-66652"] Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.660524 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564180-66652"] Mar 18 16:26:05 crc kubenswrapper[4792]: I0318 16:26:05.875749 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eeeae2e-ecbf-4682-912b-0835922d05a7" path="/var/lib/kubelet/pods/7eeeae2e-ecbf-4682-912b-0835922d05a7/volumes" Mar 18 16:26:14 crc kubenswrapper[4792]: I0318 16:26:14.855417 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:26:14 crc kubenswrapper[4792]: E0318 16:26:14.856337 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:26:26 crc kubenswrapper[4792]: I0318 16:26:26.854881 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:26:26 crc kubenswrapper[4792]: E0318 16:26:26.855686 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:26:39 crc kubenswrapper[4792]: I0318 16:26:39.854701 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:26:39 crc kubenswrapper[4792]: E0318 16:26:39.855776 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:26:49 crc kubenswrapper[4792]: I0318 16:26:49.873963 4792 scope.go:117] "RemoveContainer" containerID="f8b40ef76e5a170fb941b2fdb12acfc281e0f73ad80d1fa2211a689d1e7af66c" Mar 18 16:26:51 crc kubenswrapper[4792]: I0318 16:26:51.863657 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:26:51 crc kubenswrapper[4792]: E0318 16:26:51.865428 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:27:02 crc kubenswrapper[4792]: I0318 16:27:02.854937 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:27:02 crc kubenswrapper[4792]: E0318 16:27:02.855907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:27:15 crc kubenswrapper[4792]: I0318 16:27:15.854239 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:27:15 crc kubenswrapper[4792]: E0318 16:27:15.855146 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:27:28 crc kubenswrapper[4792]: I0318 16:27:28.854908 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:27:28 crc kubenswrapper[4792]: E0318 16:27:28.855859 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:27:42 crc kubenswrapper[4792]: I0318 16:27:42.854397 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:27:42 crc kubenswrapper[4792]: E0318 16:27:42.855234 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:27:53 crc kubenswrapper[4792]: I0318 16:27:53.854689 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:27:53 crc kubenswrapper[4792]: E0318 16:27:53.856960 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.199554 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564188-q4mrh"] Mar 18 16:28:00 crc kubenswrapper[4792]: E0318 16:28:00.200700 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a6038d-8355-4dbd-8725-6eecd2049fe7" containerName="oc" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.200720 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a6038d-8355-4dbd-8725-6eecd2049fe7" containerName="oc" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.201074 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a6038d-8355-4dbd-8725-6eecd2049fe7" containerName="oc" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.202143 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.204453 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.204467 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.204858 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.259619 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-q4mrh"] Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.336102 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s55\" (UniqueName: \"kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55\") pod \"auto-csr-approver-29564188-q4mrh\" (UID: \"1459d206-6e86-4f94-9c2c-03ad754c89b5\") " pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.438764 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s55\" (UniqueName: \"kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55\") pod \"auto-csr-approver-29564188-q4mrh\" (UID: \"1459d206-6e86-4f94-9c2c-03ad754c89b5\") " pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.458859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s55\" (UniqueName: \"kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55\") pod \"auto-csr-approver-29564188-q4mrh\" (UID: \"1459d206-6e86-4f94-9c2c-03ad754c89b5\") " 
pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.524781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:00 crc kubenswrapper[4792]: I0318 16:28:00.990675 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-q4mrh"] Mar 18 16:28:01 crc kubenswrapper[4792]: I0318 16:28:01.446821 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" event={"ID":"1459d206-6e86-4f94-9c2c-03ad754c89b5","Type":"ContainerStarted","Data":"8677af0a43a905ccaaa70007259849d202b8c49f9f626696ab75782ca9e5de7c"} Mar 18 16:28:03 crc kubenswrapper[4792]: I0318 16:28:03.468242 4792 generic.go:334] "Generic (PLEG): container finished" podID="1459d206-6e86-4f94-9c2c-03ad754c89b5" containerID="54405b664c580b070a52e89abcde2d8c292b197a715727682e379404aa4c13fe" exitCode=0 Mar 18 16:28:03 crc kubenswrapper[4792]: I0318 16:28:03.468708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" event={"ID":"1459d206-6e86-4f94-9c2c-03ad754c89b5","Type":"ContainerDied","Data":"54405b664c580b070a52e89abcde2d8c292b197a715727682e379404aa4c13fe"} Mar 18 16:28:04 crc kubenswrapper[4792]: I0318 16:28:04.857390 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:28:04 crc kubenswrapper[4792]: E0318 16:28:04.857907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" 
Mar 18 16:28:04 crc kubenswrapper[4792]: I0318 16:28:04.913949 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.067149 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s55\" (UniqueName: \"kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55\") pod \"1459d206-6e86-4f94-9c2c-03ad754c89b5\" (UID: \"1459d206-6e86-4f94-9c2c-03ad754c89b5\") " Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.073022 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55" (OuterVolumeSpecName: "kube-api-access-h8s55") pod "1459d206-6e86-4f94-9c2c-03ad754c89b5" (UID: "1459d206-6e86-4f94-9c2c-03ad754c89b5"). InnerVolumeSpecName "kube-api-access-h8s55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.171456 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s55\" (UniqueName: \"kubernetes.io/projected/1459d206-6e86-4f94-9c2c-03ad754c89b5-kube-api-access-h8s55\") on node \"crc\" DevicePath \"\"" Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.489718 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" event={"ID":"1459d206-6e86-4f94-9c2c-03ad754c89b5","Type":"ContainerDied","Data":"8677af0a43a905ccaaa70007259849d202b8c49f9f626696ab75782ca9e5de7c"} Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.489758 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8677af0a43a905ccaaa70007259849d202b8c49f9f626696ab75782ca9e5de7c" Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.489828 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564188-q4mrh" Mar 18 16:28:05 crc kubenswrapper[4792]: I0318 16:28:05.998778 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-p8crf"] Mar 18 16:28:06 crc kubenswrapper[4792]: I0318 16:28:06.010904 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564182-p8crf"] Mar 18 16:28:07 crc kubenswrapper[4792]: I0318 16:28:07.867098 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c0482d-5b4f-41f8-b10a-817c7f129804" path="/var/lib/kubelet/pods/82c0482d-5b4f-41f8-b10a-817c7f129804/volumes" Mar 18 16:28:17 crc kubenswrapper[4792]: I0318 16:28:17.854813 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:28:17 crc kubenswrapper[4792]: E0318 16:28:17.858230 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:28:28 crc kubenswrapper[4792]: I0318 16:28:28.854145 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:28:28 crc kubenswrapper[4792]: E0318 16:28:28.856239 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:28:42 crc kubenswrapper[4792]: I0318 16:28:42.855121 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247" Mar 18 16:28:43 crc kubenswrapper[4792]: I0318 16:28:43.963583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c"} Mar 18 16:28:50 crc kubenswrapper[4792]: I0318 16:28:50.008501 4792 scope.go:117] "RemoveContainer" containerID="c8ee1e138e5b3d4c65f557e82fb155486fa6e7629c21c09269d1bd9c9293bf8a" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.150680 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564190-td8gt"] Mar 18 16:30:00 crc kubenswrapper[4792]: E0318 16:30:00.151757 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1459d206-6e86-4f94-9c2c-03ad754c89b5" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.151773 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1459d206-6e86-4f94-9c2c-03ad754c89b5" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.152158 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1459d206-6e86-4f94-9c2c-03ad754c89b5" containerName="oc" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.153005 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.155575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.155609 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.156529 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.165279 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5"] Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.167997 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.173222 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.173418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.185836 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5"] Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.197690 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-td8gt"] Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.261286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhnf\" (UniqueName: 
\"kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf\") pod \"auto-csr-approver-29564190-td8gt\" (UID: \"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0\") " pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.261357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.261470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5pv2\" (UniqueName: \"kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.261622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.364224 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5pv2\" (UniqueName: \"kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" 
Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.364393 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.364592 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhnf\" (UniqueName: \"kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf\") pod \"auto-csr-approver-29564190-td8gt\" (UID: \"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0\") " pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.364629 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.366153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.373141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.383606 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhnf\" (UniqueName: \"kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf\") pod \"auto-csr-approver-29564190-td8gt\" (UID: \"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0\") " pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.384941 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5pv2\" (UniqueName: \"kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2\") pod \"collect-profiles-29564190-z2pc5\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.474078 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:00 crc kubenswrapper[4792]: I0318 16:30:00.491584 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.020604 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.025508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-td8gt"] Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.043336 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5"] Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.848159 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a8b9640-2056-47b6-9982-b0feea515131" containerID="88ee34d1fbe4b8b44f4ff64f15ba5e687dd7fb58f1c895047874626a2a4ff265" exitCode=0 Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.848268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" event={"ID":"7a8b9640-2056-47b6-9982-b0feea515131","Type":"ContainerDied","Data":"88ee34d1fbe4b8b44f4ff64f15ba5e687dd7fb58f1c895047874626a2a4ff265"} Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.848593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" event={"ID":"7a8b9640-2056-47b6-9982-b0feea515131","Type":"ContainerStarted","Data":"6b5ff64b469a18d8cb42aa25a0cd8a686da1a9f89d120d76c277eab340012d62"} Mar 18 16:30:01 crc kubenswrapper[4792]: I0318 16:30:01.891442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-td8gt" event={"ID":"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0","Type":"ContainerStarted","Data":"b19963ae312028bfa5a2a59e8f022142db8c1e8b58acf5e2c27a135cdcc6c85c"} Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.337281 4792 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.350634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume\") pod \"7a8b9640-2056-47b6-9982-b0feea515131\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.350843 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume\") pod \"7a8b9640-2056-47b6-9982-b0feea515131\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.350955 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5pv2\" (UniqueName: \"kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2\") pod \"7a8b9640-2056-47b6-9982-b0feea515131\" (UID: \"7a8b9640-2056-47b6-9982-b0feea515131\") " Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.353232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a8b9640-2056-47b6-9982-b0feea515131" (UID: "7a8b9640-2056-47b6-9982-b0feea515131"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.360285 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2" (OuterVolumeSpecName: "kube-api-access-l5pv2") pod "7a8b9640-2056-47b6-9982-b0feea515131" (UID: "7a8b9640-2056-47b6-9982-b0feea515131"). 
InnerVolumeSpecName "kube-api-access-l5pv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.360314 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a8b9640-2056-47b6-9982-b0feea515131" (UID: "7a8b9640-2056-47b6-9982-b0feea515131"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.455332 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a8b9640-2056-47b6-9982-b0feea515131-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.455384 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5pv2\" (UniqueName: \"kubernetes.io/projected/7a8b9640-2056-47b6-9982-b0feea515131-kube-api-access-l5pv2\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.455398 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a8b9640-2056-47b6-9982-b0feea515131-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.879316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" event={"ID":"7a8b9640-2056-47b6-9982-b0feea515131","Type":"ContainerDied","Data":"6b5ff64b469a18d8cb42aa25a0cd8a686da1a9f89d120d76c277eab340012d62"} Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.879693 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5ff64b469a18d8cb42aa25a0cd8a686da1a9f89d120d76c277eab340012d62" Mar 18 16:30:03 crc kubenswrapper[4792]: I0318 16:30:03.879728 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5" Mar 18 16:30:04 crc kubenswrapper[4792]: I0318 16:30:04.421744 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f"] Mar 18 16:30:04 crc kubenswrapper[4792]: I0318 16:30:04.443208 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564145-dbk6f"] Mar 18 16:30:04 crc kubenswrapper[4792]: I0318 16:30:04.892802 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" containerID="4ddc4469a1c2630c2352d2e4b873fbef7044344333742ecf05cd3b7557465ab2" exitCode=0 Mar 18 16:30:04 crc kubenswrapper[4792]: I0318 16:30:04.892910 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-td8gt" event={"ID":"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0","Type":"ContainerDied","Data":"4ddc4469a1c2630c2352d2e4b873fbef7044344333742ecf05cd3b7557465ab2"} Mar 18 16:30:05 crc kubenswrapper[4792]: I0318 16:30:05.870117 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f59420-9ada-4d61-a361-e8366afd90e6" path="/var/lib/kubelet/pods/a2f59420-9ada-4d61-a361-e8366afd90e6/volumes" Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.429077 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.548693 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhnf\" (UniqueName: \"kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf\") pod \"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0\" (UID: \"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0\") " Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.555471 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf" (OuterVolumeSpecName: "kube-api-access-dfhnf") pod "f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" (UID: "f4e6bf8a-3133-4a80-b99f-605b2a7b15e0"). InnerVolumeSpecName "kube-api-access-dfhnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.652684 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhnf\" (UniqueName: \"kubernetes.io/projected/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0-kube-api-access-dfhnf\") on node \"crc\" DevicePath \"\"" Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.913169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564190-td8gt" event={"ID":"f4e6bf8a-3133-4a80-b99f-605b2a7b15e0","Type":"ContainerDied","Data":"b19963ae312028bfa5a2a59e8f022142db8c1e8b58acf5e2c27a135cdcc6c85c"} Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.913212 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19963ae312028bfa5a2a59e8f022142db8c1e8b58acf5e2c27a135cdcc6c85c" Mar 18 16:30:06 crc kubenswrapper[4792]: I0318 16:30:06.913269 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564190-td8gt" Mar 18 16:30:07 crc kubenswrapper[4792]: I0318 16:30:07.493266 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-fnkcq"] Mar 18 16:30:07 crc kubenswrapper[4792]: I0318 16:30:07.507953 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564184-fnkcq"] Mar 18 16:30:07 crc kubenswrapper[4792]: I0318 16:30:07.868291 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9fb57e-2f97-4c16-954f-a4dc3276dd58" path="/var/lib/kubelet/pods/6c9fb57e-2f97-4c16-954f-a4dc3276dd58/volumes" Mar 18 16:30:50 crc kubenswrapper[4792]: I0318 16:30:50.137060 4792 scope.go:117] "RemoveContainer" containerID="ff31ee7a894b16e3dede3cc112abed401933798a935188e84a112ce88177e08c" Mar 18 16:30:50 crc kubenswrapper[4792]: I0318 16:30:50.191691 4792 scope.go:117] "RemoveContainer" containerID="3263e5b19059da6522a66b3c91147b8f9ac5e6ca77078eaec1621dcdc2ba4f69" Mar 18 16:31:00 crc kubenswrapper[4792]: I0318 16:31:00.321379 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:31:00 crc kubenswrapper[4792]: I0318 16:31:00.321853 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:31:30 crc kubenswrapper[4792]: I0318 16:31:30.322535 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:31:30 crc kubenswrapper[4792]: I0318 16:31:30.323189 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.366305 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-chj47"] Mar 18 16:31:35 crc kubenswrapper[4792]: E0318 16:31:35.367559 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8b9640-2056-47b6-9982-b0feea515131" containerName="collect-profiles" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.367580 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8b9640-2056-47b6-9982-b0feea515131" containerName="collect-profiles" Mar 18 16:31:35 crc kubenswrapper[4792]: E0318 16:31:35.367643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" containerName="oc" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.367656 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" containerName="oc" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.367962 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" containerName="oc" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.368002 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8b9640-2056-47b6-9982-b0feea515131" containerName="collect-profiles" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.369907 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-chj47" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.385015 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chj47"] Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.495738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6ls\" (UniqueName: \"kubernetes.io/projected/143df0e5-40e9-4536-8285-509497426831-kube-api-access-6n6ls\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.495929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-catalog-content\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.496006 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-utilities\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.598098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6ls\" (UniqueName: \"kubernetes.io/projected/143df0e5-40e9-4536-8285-509497426831-kube-api-access-6n6ls\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47" Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.598329 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-catalog-content\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.598411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-utilities\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.599138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-catalog-content\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.600445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/143df0e5-40e9-4536-8285-509497426831-utilities\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.649749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6ls\" (UniqueName: \"kubernetes.io/projected/143df0e5-40e9-4536-8285-509497426831-kube-api-access-6n6ls\") pod \"redhat-operators-chj47\" (UID: \"143df0e5-40e9-4536-8285-509497426831\") " pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:35 crc kubenswrapper[4792]: I0318 16:31:35.696622 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:31:36 crc kubenswrapper[4792]: I0318 16:31:36.230155 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chj47"]
Mar 18 16:31:36 crc kubenswrapper[4792]: I0318 16:31:36.692825 4792 generic.go:334] "Generic (PLEG): container finished" podID="143df0e5-40e9-4536-8285-509497426831" containerID="b3633e0f75045b0a59335846f63b76edfe9c8fd4057656ec2292597b778e4d9e" exitCode=0
Mar 18 16:31:36 crc kubenswrapper[4792]: I0318 16:31:36.692897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chj47" event={"ID":"143df0e5-40e9-4536-8285-509497426831","Type":"ContainerDied","Data":"b3633e0f75045b0a59335846f63b76edfe9c8fd4057656ec2292597b778e4d9e"}
Mar 18 16:31:36 crc kubenswrapper[4792]: I0318 16:31:36.693098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chj47" event={"ID":"143df0e5-40e9-4536-8285-509497426831","Type":"ContainerStarted","Data":"995d824aac053ac565b5fd03d9fd45f83db63ef376e55b7175b1f95eaadda016"}
Mar 18 16:31:47 crc kubenswrapper[4792]: I0318 16:31:47.804797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chj47" event={"ID":"143df0e5-40e9-4536-8285-509497426831","Type":"ContainerStarted","Data":"4a0009cfbc63a18ca8e1f969004bdf8ccd722add3fdc3e9fb4184007ec1e9cba"}
Mar 18 16:31:54 crc kubenswrapper[4792]: I0318 16:31:54.889291 4792 generic.go:334] "Generic (PLEG): container finished" podID="143df0e5-40e9-4536-8285-509497426831" containerID="4a0009cfbc63a18ca8e1f969004bdf8ccd722add3fdc3e9fb4184007ec1e9cba" exitCode=0
Mar 18 16:31:54 crc kubenswrapper[4792]: I0318 16:31:54.889415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chj47" event={"ID":"143df0e5-40e9-4536-8285-509497426831","Type":"ContainerDied","Data":"4a0009cfbc63a18ca8e1f969004bdf8ccd722add3fdc3e9fb4184007ec1e9cba"}
Mar 18 16:31:55 crc kubenswrapper[4792]: I0318 16:31:55.904789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chj47" event={"ID":"143df0e5-40e9-4536-8285-509497426831","Type":"ContainerStarted","Data":"2f3304d43023cb2cabe774f18192b7c3c202dd945a2d432238198a6bd650bc00"}
Mar 18 16:31:55 crc kubenswrapper[4792]: I0318 16:31:55.927298 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-chj47" podStartSLOduration=2.123879091 podStartE2EDuration="20.92727551s" podCreationTimestamp="2026-03-18 16:31:35 +0000 UTC" firstStartedPulling="2026-03-18 16:31:36.694619382 +0000 UTC m=+3445.563948319" lastFinishedPulling="2026-03-18 16:31:55.498015781 +0000 UTC m=+3464.367344738" observedRunningTime="2026-03-18 16:31:55.923055557 +0000 UTC m=+3464.792384494" watchObservedRunningTime="2026-03-18 16:31:55.92727551 +0000 UTC m=+3464.796604447"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.163058 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564192-zfrht"]
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.166494 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.172439 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.172735 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.178944 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.182870 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-zfrht"]
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.284196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5n9\" (UniqueName: \"kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9\") pod \"auto-csr-approver-29564192-zfrht\" (UID: \"33676ab0-bcc8-413d-9f28-7c6264a3cb5c\") " pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.321478 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.321560 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.321618 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.322756 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.322822 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c" gracePeriod=600
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.387699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5n9\" (UniqueName: \"kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9\") pod \"auto-csr-approver-29564192-zfrht\" (UID: \"33676ab0-bcc8-413d-9f28-7c6264a3cb5c\") " pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.408360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5n9\" (UniqueName: \"kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9\") pod \"auto-csr-approver-29564192-zfrht\" (UID: \"33676ab0-bcc8-413d-9f28-7c6264a3cb5c\") " pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.499546 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.969680 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c" exitCode=0
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.969704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c"}
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.970662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7"}
Mar 18 16:32:00 crc kubenswrapper[4792]: I0318 16:32:00.970694 4792 scope.go:117] "RemoveContainer" containerID="6a1f82c3fb865c2f9a7ecf444ce19f1da15a2660a0c1556dafaad95aebf25247"
Mar 18 16:32:01 crc kubenswrapper[4792]: I0318 16:32:01.067778 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-zfrht"]
Mar 18 16:32:01 crc kubenswrapper[4792]: I0318 16:32:01.982646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-zfrht" event={"ID":"33676ab0-bcc8-413d-9f28-7c6264a3cb5c","Type":"ContainerStarted","Data":"48c210a72ac7ae65be06f4f8f86cca2846f63a9a3edfbcc2f20d963e5f24c2aa"}
Mar 18 16:32:03 crc kubenswrapper[4792]: I0318 16:32:03.040933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-zfrht" event={"ID":"33676ab0-bcc8-413d-9f28-7c6264a3cb5c","Type":"ContainerStarted","Data":"5d2ef11206240159da58db53b302afebc6770cc0e1085364bf1c95af9c100f44"}
Mar 18 16:32:03 crc kubenswrapper[4792]: I0318 16:32:03.072381 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564192-zfrht" podStartSLOduration=1.839113365 podStartE2EDuration="3.072229744s" podCreationTimestamp="2026-03-18 16:32:00 +0000 UTC" firstStartedPulling="2026-03-18 16:32:01.068154866 +0000 UTC m=+3469.937483803" lastFinishedPulling="2026-03-18 16:32:02.301271255 +0000 UTC m=+3471.170600182" observedRunningTime="2026-03-18 16:32:03.063300084 +0000 UTC m=+3471.932629041" watchObservedRunningTime="2026-03-18 16:32:03.072229744 +0000 UTC m=+3471.941558681"
Mar 18 16:32:04 crc kubenswrapper[4792]: I0318 16:32:04.052338 4792 generic.go:334] "Generic (PLEG): container finished" podID="33676ab0-bcc8-413d-9f28-7c6264a3cb5c" containerID="5d2ef11206240159da58db53b302afebc6770cc0e1085364bf1c95af9c100f44" exitCode=0
Mar 18 16:32:04 crc kubenswrapper[4792]: I0318 16:32:04.052712 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-zfrht" event={"ID":"33676ab0-bcc8-413d-9f28-7c6264a3cb5c","Type":"ContainerDied","Data":"5d2ef11206240159da58db53b302afebc6770cc0e1085364bf1c95af9c100f44"}
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.551660 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.628559 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m5n9\" (UniqueName: \"kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9\") pod \"33676ab0-bcc8-413d-9f28-7c6264a3cb5c\" (UID: \"33676ab0-bcc8-413d-9f28-7c6264a3cb5c\") "
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.648733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9" (OuterVolumeSpecName: "kube-api-access-5m5n9") pod "33676ab0-bcc8-413d-9f28-7c6264a3cb5c" (UID: "33676ab0-bcc8-413d-9f28-7c6264a3cb5c"). InnerVolumeSpecName "kube-api-access-5m5n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.697550 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.697604 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:32:05 crc kubenswrapper[4792]: I0318 16:32:05.732358 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m5n9\" (UniqueName: \"kubernetes.io/projected/33676ab0-bcc8-413d-9f28-7c6264a3cb5c-kube-api-access-5m5n9\") on node \"crc\" DevicePath \"\""
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.074599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564192-zfrht" event={"ID":"33676ab0-bcc8-413d-9f28-7c6264a3cb5c","Type":"ContainerDied","Data":"48c210a72ac7ae65be06f4f8f86cca2846f63a9a3edfbcc2f20d963e5f24c2aa"}
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.074640 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c210a72ac7ae65be06f4f8f86cca2846f63a9a3edfbcc2f20d963e5f24c2aa"
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.074653 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564192-zfrht"
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.144623 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-fw9hp"]
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.157196 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564186-fw9hp"]
Mar 18 16:32:06 crc kubenswrapper[4792]: I0318 16:32:06.744711 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:32:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:32:06 crc kubenswrapper[4792]: >
Mar 18 16:32:07 crc kubenswrapper[4792]: I0318 16:32:07.868354 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a6038d-8355-4dbd-8725-6eecd2049fe7" path="/var/lib/kubelet/pods/f8a6038d-8355-4dbd-8725-6eecd2049fe7/volumes"
Mar 18 16:32:16 crc kubenswrapper[4792]: I0318 16:32:16.747358 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:32:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:32:16 crc kubenswrapper[4792]: >
Mar 18 16:32:26 crc kubenswrapper[4792]: I0318 16:32:26.747693 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:32:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:32:26 crc kubenswrapper[4792]: >
Mar 18 16:32:36 crc kubenswrapper[4792]: I0318 16:32:36.756034 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=<
Mar 18 16:32:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 18 16:32:36 crc kubenswrapper[4792]: >
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.152085 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:32:41 crc kubenswrapper[4792]: E0318 16:32:41.153480 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33676ab0-bcc8-413d-9f28-7c6264a3cb5c" containerName="oc"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.153502 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="33676ab0-bcc8-413d-9f28-7c6264a3cb5c" containerName="oc"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.153912 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="33676ab0-bcc8-413d-9f28-7c6264a3cb5c" containerName="oc"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.156192 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.175934 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.249442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84kwc\" (UniqueName: \"kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.249824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.250464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.353179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.353293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84kwc\" (UniqueName: \"kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.353356 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.353984 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.354091 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.377195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84kwc\" (UniqueName: \"kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc\") pod \"redhat-marketplace-9mng7\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") " pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:41 crc kubenswrapper[4792]: I0318 16:32:41.494118 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:42 crc kubenswrapper[4792]: I0318 16:32:42.028554 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:32:42 crc kubenswrapper[4792]: I0318 16:32:42.484613 4792 generic.go:334] "Generic (PLEG): container finished" podID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerID="1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621" exitCode=0
Mar 18 16:32:42 crc kubenswrapper[4792]: I0318 16:32:42.484664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerDied","Data":"1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621"}
Mar 18 16:32:42 crc kubenswrapper[4792]: I0318 16:32:42.484715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerStarted","Data":"361f98aa6319fedfd03f89c441dd8a032550aeade422b5cc76bb46248ae03959"}
Mar 18 16:32:44 crc kubenswrapper[4792]: I0318 16:32:44.512458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerStarted","Data":"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8"}
Mar 18 16:32:45 crc kubenswrapper[4792]: I0318 16:32:45.742652 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:32:45 crc kubenswrapper[4792]: I0318 16:32:45.791680 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-chj47"
Mar 18 16:32:46 crc kubenswrapper[4792]: I0318 16:32:46.339283 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chj47"]
Mar 18 16:32:46 crc kubenswrapper[4792]: I0318 16:32:46.551074 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"]
Mar 18 16:32:46 crc kubenswrapper[4792]: I0318 16:32:46.551378 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vp7sb" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" containerID="cri-o://d1ca4c6e10e20d178e271c3882dbe33690af31be007849b0397258e893b23bc2" gracePeriod=2
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.595855 4792 generic.go:334] "Generic (PLEG): container finished" podID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerID="9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8" exitCode=0
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.595927 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerDied","Data":"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8"}
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.602742 4792 generic.go:334] "Generic (PLEG): container finished" podID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerID="d1ca4c6e10e20d178e271c3882dbe33690af31be007849b0397258e893b23bc2" exitCode=0
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.603064 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerDied","Data":"d1ca4c6e10e20d178e271c3882dbe33690af31be007849b0397258e893b23bc2"}
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.891883 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vp7sb"
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.950399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content\") pod \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") "
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.950444 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities\") pod \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") "
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.950595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dknl\" (UniqueName: \"kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl\") pod \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\" (UID: \"a8a131ee-9ed6-4243-b856-c688d4fd9b89\") "
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.952448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities" (OuterVolumeSpecName: "utilities") pod "a8a131ee-9ed6-4243-b856-c688d4fd9b89" (UID: "a8a131ee-9ed6-4243-b856-c688d4fd9b89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.953061 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:32:47 crc kubenswrapper[4792]: I0318 16:32:47.956186 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl" (OuterVolumeSpecName: "kube-api-access-8dknl") pod "a8a131ee-9ed6-4243-b856-c688d4fd9b89" (UID: "a8a131ee-9ed6-4243-b856-c688d4fd9b89"). InnerVolumeSpecName "kube-api-access-8dknl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.055454 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dknl\" (UniqueName: \"kubernetes.io/projected/a8a131ee-9ed6-4243-b856-c688d4fd9b89-kube-api-access-8dknl\") on node \"crc\" DevicePath \"\""
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.083290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8a131ee-9ed6-4243-b856-c688d4fd9b89" (UID: "a8a131ee-9ed6-4243-b856-c688d4fd9b89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.157582 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a131ee-9ed6-4243-b856-c688d4fd9b89-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.616066 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vp7sb"
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.616058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vp7sb" event={"ID":"a8a131ee-9ed6-4243-b856-c688d4fd9b89","Type":"ContainerDied","Data":"218ebac793635765d1daf991b0164bb238b5d0abfd7f5c2bc22514eb75176a0e"}
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.616603 4792 scope.go:117] "RemoveContainer" containerID="d1ca4c6e10e20d178e271c3882dbe33690af31be007849b0397258e893b23bc2"
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.618877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerStarted","Data":"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"}
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.688612 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9mng7" podStartSLOduration=1.9867488469999999 podStartE2EDuration="7.688582876s" podCreationTimestamp="2026-03-18 16:32:41 +0000 UTC" firstStartedPulling="2026-03-18 16:32:42.487035876 +0000 UTC m=+3511.356364813" lastFinishedPulling="2026-03-18 16:32:48.188869905 +0000 UTC m=+3517.058198842" observedRunningTime="2026-03-18 16:32:48.676422644 +0000 UTC m=+3517.545751601" watchObservedRunningTime="2026-03-18 16:32:48.688582876 +0000 UTC m=+3517.557911813"
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.696626 4792 scope.go:117] "RemoveContainer" containerID="11bcfe2e652fb3b0ba293d65c6d50de8f4ba1033b18f5dcc2476a348640124ea"
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.736671 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"]
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.761230 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vp7sb"]
Mar 18 16:32:48 crc kubenswrapper[4792]: I0318 16:32:48.787199 4792 scope.go:117] "RemoveContainer" containerID="25c1c7823d5d3b5f3c0327d275bfaed6d6aaf2a721768e8b2ac3919a1b2e8623"
Mar 18 16:32:49 crc kubenswrapper[4792]: I0318 16:32:49.878692 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" path="/var/lib/kubelet/pods/a8a131ee-9ed6-4243-b856-c688d4fd9b89/volumes"
Mar 18 16:32:50 crc kubenswrapper[4792]: I0318 16:32:50.363319 4792 scope.go:117] "RemoveContainer" containerID="c91abe37d309d6d06480473eb4eb404a017b78ba01dc201a7083c7b89fbc0491"
Mar 18 16:32:51 crc kubenswrapper[4792]: I0318 16:32:51.495300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:51 crc kubenswrapper[4792]: I0318 16:32:51.495683 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:32:51 crc kubenswrapper[4792]: I0318 16:32:51.554929 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:33:01 crc kubenswrapper[4792]: I0318 16:33:01.546651 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:33:01 crc kubenswrapper[4792]: I0318 16:33:01.605431 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:33:01 crc kubenswrapper[4792]: I0318 16:33:01.778389 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9mng7" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="registry-server" containerID="cri-o://3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed" gracePeriod=2
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.316656 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.422886 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84kwc\" (UniqueName: \"kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc\") pod \"e59e5e47-020b-4052-9b61-9bbfb477efb5\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") "
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.423330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities\") pod \"e59e5e47-020b-4052-9b61-9bbfb477efb5\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") "
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.423478 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content\") pod \"e59e5e47-020b-4052-9b61-9bbfb477efb5\" (UID: \"e59e5e47-020b-4052-9b61-9bbfb477efb5\") "
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.424163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities" (OuterVolumeSpecName: "utilities") pod "e59e5e47-020b-4052-9b61-9bbfb477efb5" (UID: "e59e5e47-020b-4052-9b61-9bbfb477efb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.424783 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.432502 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc" (OuterVolumeSpecName: "kube-api-access-84kwc") pod "e59e5e47-020b-4052-9b61-9bbfb477efb5" (UID: "e59e5e47-020b-4052-9b61-9bbfb477efb5"). InnerVolumeSpecName "kube-api-access-84kwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.454548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e59e5e47-020b-4052-9b61-9bbfb477efb5" (UID: "e59e5e47-020b-4052-9b61-9bbfb477efb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.526612 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84kwc\" (UniqueName: \"kubernetes.io/projected/e59e5e47-020b-4052-9b61-9bbfb477efb5-kube-api-access-84kwc\") on node \"crc\" DevicePath \"\""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.526672 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59e5e47-020b-4052-9b61-9bbfb477efb5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.790685 4792 generic.go:334] "Generic (PLEG): container finished" podID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerID="3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed" exitCode=0
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.790728 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerDied","Data":"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"}
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.790753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mng7" event={"ID":"e59e5e47-020b-4052-9b61-9bbfb477efb5","Type":"ContainerDied","Data":"361f98aa6319fedfd03f89c441dd8a032550aeade422b5cc76bb46248ae03959"}
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.790770 4792 scope.go:117] "RemoveContainer" containerID="3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.790844 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mng7"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.832228 4792 scope.go:117] "RemoveContainer" containerID="9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.836395 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.854855 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mng7"]
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.858275 4792 scope.go:117] "RemoveContainer" containerID="1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.925308 4792 scope.go:117] "RemoveContainer" containerID="3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"
Mar 18 16:33:02 crc kubenswrapper[4792]: E0318 16:33:02.927341 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed\": container with ID starting with 3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed not found: ID does not exist" containerID="3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"
Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.927386 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed"} err="failed to get container status \"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed\": rpc error: code = NotFound desc = could not find container \"3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed\": container with ID starting with 3a42d875881291978313d48305f1864857ba6ad96b6b82ad8e8a4a7ff0691fed not found: 
ID does not exist" Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.927416 4792 scope.go:117] "RemoveContainer" containerID="9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8" Mar 18 16:33:02 crc kubenswrapper[4792]: E0318 16:33:02.927752 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8\": container with ID starting with 9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8 not found: ID does not exist" containerID="9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8" Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.927808 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8"} err="failed to get container status \"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8\": rpc error: code = NotFound desc = could not find container \"9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8\": container with ID starting with 9e9788c98663c82982dfba615abe2f92cf3036a48db97a992ad8cc26730b2ca8 not found: ID does not exist" Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.927842 4792 scope.go:117] "RemoveContainer" containerID="1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621" Mar 18 16:33:02 crc kubenswrapper[4792]: E0318 16:33:02.928924 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621\": container with ID starting with 1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621 not found: ID does not exist" containerID="1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621" Mar 18 16:33:02 crc kubenswrapper[4792]: I0318 16:33:02.929100 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621"} err="failed to get container status \"1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621\": rpc error: code = NotFound desc = could not find container \"1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621\": container with ID starting with 1d6d6f9b3c864b8d8f69b54cd5bb0f630dec8852d5752e3cd2bc2d599f2aa621 not found: ID does not exist" Mar 18 16:33:03 crc kubenswrapper[4792]: I0318 16:33:03.870358 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" path="/var/lib/kubelet/pods/e59e5e47-020b-4052-9b61-9bbfb477efb5/volumes" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.155163 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fpccv"] Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156167 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="extract-content" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156180 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="extract-content" Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156208 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156214 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156244 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="extract-utilities" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156251 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="extract-utilities" Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156262 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156267 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156285 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="extract-content" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156290 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="extract-content" Mar 18 16:34:00 crc kubenswrapper[4792]: E0318 16:34:00.156303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="extract-utilities" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156309 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="extract-utilities" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156530 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59e5e47-020b-4052-9b61-9bbfb477efb5" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.156546 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a131ee-9ed6-4243-b856-c688d4fd9b89" containerName="registry-server" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.157424 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.160288 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.160553 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.166645 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fpccv"] Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.167680 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.284039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6trs\" (UniqueName: \"kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs\") pod \"auto-csr-approver-29564194-fpccv\" (UID: \"02939277-0f04-4780-87fe-5dad11793e5c\") " pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.321781 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.321854 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:34:00 crc 
kubenswrapper[4792]: I0318 16:34:00.387019 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6trs\" (UniqueName: \"kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs\") pod \"auto-csr-approver-29564194-fpccv\" (UID: \"02939277-0f04-4780-87fe-5dad11793e5c\") " pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.415361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6trs\" (UniqueName: \"kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs\") pod \"auto-csr-approver-29564194-fpccv\" (UID: \"02939277-0f04-4780-87fe-5dad11793e5c\") " pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:00 crc kubenswrapper[4792]: I0318 16:34:00.479569 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:01 crc kubenswrapper[4792]: I0318 16:34:00.999545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fpccv"] Mar 18 16:34:01 crc kubenswrapper[4792]: I0318 16:34:01.475000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fpccv" event={"ID":"02939277-0f04-4780-87fe-5dad11793e5c","Type":"ContainerStarted","Data":"2c2132e60153487f31c16e45b708104387434a5dcda3372a4dce91adc9776e39"} Mar 18 16:34:02 crc kubenswrapper[4792]: I0318 16:34:02.501504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fpccv" event={"ID":"02939277-0f04-4780-87fe-5dad11793e5c","Type":"ContainerStarted","Data":"6fd8c20e66eb249409ec65d2f867c21cb8eecf0c8cbdb17cb7137010ca10c1a3"} Mar 18 16:34:02 crc kubenswrapper[4792]: I0318 16:34:02.533573 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29564194-fpccv" podStartSLOduration=1.407931506 podStartE2EDuration="2.533547751s" podCreationTimestamp="2026-03-18 16:34:00 +0000 UTC" firstStartedPulling="2026-03-18 16:34:01.005723887 +0000 UTC m=+3589.875052824" lastFinishedPulling="2026-03-18 16:34:02.131340132 +0000 UTC m=+3591.000669069" observedRunningTime="2026-03-18 16:34:02.521130332 +0000 UTC m=+3591.390459279" watchObservedRunningTime="2026-03-18 16:34:02.533547751 +0000 UTC m=+3591.402876688" Mar 18 16:34:03 crc kubenswrapper[4792]: I0318 16:34:03.517596 4792 generic.go:334] "Generic (PLEG): container finished" podID="02939277-0f04-4780-87fe-5dad11793e5c" containerID="6fd8c20e66eb249409ec65d2f867c21cb8eecf0c8cbdb17cb7137010ca10c1a3" exitCode=0 Mar 18 16:34:03 crc kubenswrapper[4792]: I0318 16:34:03.517750 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fpccv" event={"ID":"02939277-0f04-4780-87fe-5dad11793e5c","Type":"ContainerDied","Data":"6fd8c20e66eb249409ec65d2f867c21cb8eecf0c8cbdb17cb7137010ca10c1a3"} Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.032146 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.113396 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6trs\" (UniqueName: \"kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs\") pod \"02939277-0f04-4780-87fe-5dad11793e5c\" (UID: \"02939277-0f04-4780-87fe-5dad11793e5c\") " Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.123023 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs" (OuterVolumeSpecName: "kube-api-access-k6trs") pod "02939277-0f04-4780-87fe-5dad11793e5c" (UID: "02939277-0f04-4780-87fe-5dad11793e5c"). InnerVolumeSpecName "kube-api-access-k6trs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.216729 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6trs\" (UniqueName: \"kubernetes.io/projected/02939277-0f04-4780-87fe-5dad11793e5c-kube-api-access-k6trs\") on node \"crc\" DevicePath \"\"" Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.544173 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564194-fpccv" event={"ID":"02939277-0f04-4780-87fe-5dad11793e5c","Type":"ContainerDied","Data":"2c2132e60153487f31c16e45b708104387434a5dcda3372a4dce91adc9776e39"} Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.544498 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2132e60153487f31c16e45b708104387434a5dcda3372a4dce91adc9776e39" Mar 18 16:34:05 crc kubenswrapper[4792]: I0318 16:34:05.544234 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564194-fpccv" Mar 18 16:34:06 crc kubenswrapper[4792]: I0318 16:34:06.123922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-q4mrh"] Mar 18 16:34:06 crc kubenswrapper[4792]: I0318 16:34:06.138920 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564188-q4mrh"] Mar 18 16:34:07 crc kubenswrapper[4792]: I0318 16:34:07.868765 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1459d206-6e86-4f94-9c2c-03ad754c89b5" path="/var/lib/kubelet/pods/1459d206-6e86-4f94-9c2c-03ad754c89b5/volumes" Mar 18 16:34:30 crc kubenswrapper[4792]: I0318 16:34:30.322350 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:34:30 crc kubenswrapper[4792]: I0318 16:34:30.323064 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:34:50 crc kubenswrapper[4792]: I0318 16:34:50.512192 4792 scope.go:117] "RemoveContainer" containerID="54405b664c580b070a52e89abcde2d8c292b197a715727682e379404aa4c13fe" Mar 18 16:35:00 crc kubenswrapper[4792]: I0318 16:35:00.321882 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:35:00 crc kubenswrapper[4792]: 
I0318 16:35:00.322496 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:35:00 crc kubenswrapper[4792]: I0318 16:35:00.322545 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:35:00 crc kubenswrapper[4792]: I0318 16:35:00.323579 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:35:00 crc kubenswrapper[4792]: I0318 16:35:00.323636 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" gracePeriod=600 Mar 18 16:35:00 crc kubenswrapper[4792]: E0318 16:35:00.462359 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:35:01 crc kubenswrapper[4792]: I0318 16:35:01.161302 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" exitCode=0 Mar 18 16:35:01 crc kubenswrapper[4792]: I0318 16:35:01.161459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7"} Mar 18 16:35:01 crc kubenswrapper[4792]: I0318 16:35:01.161608 4792 scope.go:117] "RemoveContainer" containerID="322f1fc2f95037f713b3fbe6f082e8c10e13e85495957fd2f319f4c0d42b5a1c" Mar 18 16:35:01 crc kubenswrapper[4792]: I0318 16:35:01.162502 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:35:01 crc kubenswrapper[4792]: E0318 16:35:01.163035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:35:05 crc kubenswrapper[4792]: I0318 16:35:05.018404 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7984d74779-bxqk7" podUID="e19c85aa-c5a1-4d0d-99ff-cc9283e5252f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.465578 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:10 crc kubenswrapper[4792]: E0318 16:35:10.469213 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02939277-0f04-4780-87fe-5dad11793e5c" containerName="oc" Mar 18 
16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.469240 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02939277-0f04-4780-87fe-5dad11793e5c" containerName="oc" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.469555 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="02939277-0f04-4780-87fe-5dad11793e5c" containerName="oc" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.471606 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.498236 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.611221 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.611320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxqv\" (UniqueName: \"kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.611591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc 
kubenswrapper[4792]: I0318 16:35:10.714380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxqv\" (UniqueName: \"kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.714556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.714724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.715283 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.715299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.736162 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxqv\" (UniqueName: \"kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv\") pod \"community-operators-pwjtz\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:10 crc kubenswrapper[4792]: I0318 16:35:10.795657 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:11 crc kubenswrapper[4792]: I0318 16:35:11.381418 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:12 crc kubenswrapper[4792]: I0318 16:35:12.287334 4792 generic.go:334] "Generic (PLEG): container finished" podID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerID="3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc" exitCode=0 Mar 18 16:35:12 crc kubenswrapper[4792]: I0318 16:35:12.287411 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerDied","Data":"3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc"} Mar 18 16:35:12 crc kubenswrapper[4792]: I0318 16:35:12.287815 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerStarted","Data":"a3eaef46e9ed04202ff7810dcad54822d8f2775a5b18793f69640c29778c375d"} Mar 18 16:35:12 crc kubenswrapper[4792]: I0318 16:35:12.290434 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:35:12 crc kubenswrapper[4792]: I0318 16:35:12.854502 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:35:12 crc kubenswrapper[4792]: E0318 
16:35:12.855107 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:35:14 crc kubenswrapper[4792]: I0318 16:35:14.323365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerStarted","Data":"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93"} Mar 18 16:35:16 crc kubenswrapper[4792]: I0318 16:35:16.344631 4792 generic.go:334] "Generic (PLEG): container finished" podID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerID="1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93" exitCode=0 Mar 18 16:35:16 crc kubenswrapper[4792]: I0318 16:35:16.345181 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerDied","Data":"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93"} Mar 18 16:35:17 crc kubenswrapper[4792]: I0318 16:35:17.369873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerStarted","Data":"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc"} Mar 18 16:35:17 crc kubenswrapper[4792]: I0318 16:35:17.399742 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pwjtz" podStartSLOduration=2.728196471 podStartE2EDuration="7.399718836s" podCreationTimestamp="2026-03-18 16:35:10 +0000 UTC" 
firstStartedPulling="2026-03-18 16:35:12.290225199 +0000 UTC m=+3661.159554136" lastFinishedPulling="2026-03-18 16:35:16.961747574 +0000 UTC m=+3665.831076501" observedRunningTime="2026-03-18 16:35:17.391823137 +0000 UTC m=+3666.261152084" watchObservedRunningTime="2026-03-18 16:35:17.399718836 +0000 UTC m=+3666.269047773" Mar 18 16:35:20 crc kubenswrapper[4792]: I0318 16:35:20.796122 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:20 crc kubenswrapper[4792]: I0318 16:35:20.797144 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:20 crc kubenswrapper[4792]: I0318 16:35:20.855859 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:26 crc kubenswrapper[4792]: I0318 16:35:26.854265 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:35:26 crc kubenswrapper[4792]: E0318 16:35:26.855130 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:35:30 crc kubenswrapper[4792]: I0318 16:35:30.851112 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:30 crc kubenswrapper[4792]: I0318 16:35:30.918155 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:31 crc kubenswrapper[4792]: I0318 16:35:31.523664 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pwjtz" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="registry-server" containerID="cri-o://e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc" gracePeriod=2 Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.169687 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.354545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities\") pod \"09466b2e-b090-4ae6-b74a-b3a35836535b\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.356012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cxqv\" (UniqueName: \"kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv\") pod \"09466b2e-b090-4ae6-b74a-b3a35836535b\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.356145 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content\") pod \"09466b2e-b090-4ae6-b74a-b3a35836535b\" (UID: \"09466b2e-b090-4ae6-b74a-b3a35836535b\") " Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.356219 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities" (OuterVolumeSpecName: "utilities") pod "09466b2e-b090-4ae6-b74a-b3a35836535b" (UID: "09466b2e-b090-4ae6-b74a-b3a35836535b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.359543 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.364006 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv" (OuterVolumeSpecName: "kube-api-access-4cxqv") pod "09466b2e-b090-4ae6-b74a-b3a35836535b" (UID: "09466b2e-b090-4ae6-b74a-b3a35836535b"). InnerVolumeSpecName "kube-api-access-4cxqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.414563 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09466b2e-b090-4ae6-b74a-b3a35836535b" (UID: "09466b2e-b090-4ae6-b74a-b3a35836535b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.462160 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09466b2e-b090-4ae6-b74a-b3a35836535b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.462202 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxqv\" (UniqueName: \"kubernetes.io/projected/09466b2e-b090-4ae6-b74a-b3a35836535b-kube-api-access-4cxqv\") on node \"crc\" DevicePath \"\"" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.544643 4792 generic.go:334] "Generic (PLEG): container finished" podID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerID="e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc" exitCode=0 Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.544718 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerDied","Data":"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc"} Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.545096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjtz" event={"ID":"09466b2e-b090-4ae6-b74a-b3a35836535b","Type":"ContainerDied","Data":"a3eaef46e9ed04202ff7810dcad54822d8f2775a5b18793f69640c29778c375d"} Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.545124 4792 scope.go:117] "RemoveContainer" containerID="e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.545231 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwjtz" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.584235 4792 scope.go:117] "RemoveContainer" containerID="1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.599528 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.613950 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pwjtz"] Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.618178 4792 scope.go:117] "RemoveContainer" containerID="3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.682399 4792 scope.go:117] "RemoveContainer" containerID="e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc" Mar 18 16:35:32 crc kubenswrapper[4792]: E0318 16:35:32.682907 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc\": container with ID starting with e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc not found: ID does not exist" containerID="e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.682938 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc"} err="failed to get container status \"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc\": rpc error: code = NotFound desc = could not find container \"e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc\": container with ID starting with e1d22e8bf634ea9063982e7e6a3dc0271e8c225b80f40d4b9bb5a0f5206cd1dc not 
found: ID does not exist" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.682962 4792 scope.go:117] "RemoveContainer" containerID="1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93" Mar 18 16:35:32 crc kubenswrapper[4792]: E0318 16:35:32.683356 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93\": container with ID starting with 1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93 not found: ID does not exist" containerID="1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.683382 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93"} err="failed to get container status \"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93\": rpc error: code = NotFound desc = could not find container \"1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93\": container with ID starting with 1b17a6ce1db440bb59b08eab2b3a13f64e668356730a7f8c6ce545b506069d93 not found: ID does not exist" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.683395 4792 scope.go:117] "RemoveContainer" containerID="3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc" Mar 18 16:35:32 crc kubenswrapper[4792]: E0318 16:35:32.683648 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc\": container with ID starting with 3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc not found: ID does not exist" containerID="3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc" Mar 18 16:35:32 crc kubenswrapper[4792]: I0318 16:35:32.683668 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc"} err="failed to get container status \"3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc\": rpc error: code = NotFound desc = could not find container \"3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc\": container with ID starting with 3c0955c81b9cfd308646a2154a16a0e2ee3a484a5015fa7519433e2202c224fc not found: ID does not exist" Mar 18 16:35:33 crc kubenswrapper[4792]: I0318 16:35:33.869792 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" path="/var/lib/kubelet/pods/09466b2e-b090-4ae6-b74a-b3a35836535b/volumes" Mar 18 16:35:40 crc kubenswrapper[4792]: I0318 16:35:40.854501 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:35:40 crc kubenswrapper[4792]: E0318 16:35:40.856476 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:35:54 crc kubenswrapper[4792]: I0318 16:35:54.863397 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:35:54 crc kubenswrapper[4792]: E0318 16:35:54.864410 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.162381 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564196-5pb4t"] Mar 18 16:36:00 crc kubenswrapper[4792]: E0318 16:36:00.163250 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.163262 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4792]: E0318 16:36:00.163290 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.163296 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="extract-content" Mar 18 16:36:00 crc kubenswrapper[4792]: E0318 16:36:00.163323 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.163330 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="extract-utilities" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.163569 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09466b2e-b090-4ae6-b74a-b3a35836535b" containerName="registry-server" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.164389 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.171878 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.176276 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.178635 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.180838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-5pb4t"] Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.193106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4r7\" (UniqueName: \"kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7\") pod \"auto-csr-approver-29564196-5pb4t\" (UID: \"fbef5a02-ad3a-472e-b026-1151782480f9\") " pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.295160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4r7\" (UniqueName: \"kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7\") pod \"auto-csr-approver-29564196-5pb4t\" (UID: \"fbef5a02-ad3a-472e-b026-1151782480f9\") " pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.320149 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4r7\" (UniqueName: \"kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7\") pod \"auto-csr-approver-29564196-5pb4t\" (UID: \"fbef5a02-ad3a-472e-b026-1151782480f9\") " 
pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.488065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:00 crc kubenswrapper[4792]: I0318 16:36:00.988899 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-5pb4t"] Mar 18 16:36:01 crc kubenswrapper[4792]: I0318 16:36:01.872224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" event={"ID":"fbef5a02-ad3a-472e-b026-1151782480f9","Type":"ContainerStarted","Data":"fabd446d89790fe14d4537c3e8f0e7576558b027f5394ae46883fd3e42ee5fc2"} Mar 18 16:36:02 crc kubenswrapper[4792]: I0318 16:36:02.894514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" event={"ID":"fbef5a02-ad3a-472e-b026-1151782480f9","Type":"ContainerStarted","Data":"858bde005097f9d76fd861e0958c9f16acc968dd662a90dcdcc276c1a0cd2863"} Mar 18 16:36:02 crc kubenswrapper[4792]: I0318 16:36:02.919228 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" podStartSLOduration=1.7484962 podStartE2EDuration="2.919203235s" podCreationTimestamp="2026-03-18 16:36:00 +0000 UTC" firstStartedPulling="2026-03-18 16:36:00.996355544 +0000 UTC m=+3709.865684481" lastFinishedPulling="2026-03-18 16:36:02.167062579 +0000 UTC m=+3711.036391516" observedRunningTime="2026-03-18 16:36:02.909480167 +0000 UTC m=+3711.778809114" watchObservedRunningTime="2026-03-18 16:36:02.919203235 +0000 UTC m=+3711.788532172" Mar 18 16:36:03 crc kubenswrapper[4792]: I0318 16:36:03.904528 4792 generic.go:334] "Generic (PLEG): container finished" podID="fbef5a02-ad3a-472e-b026-1151782480f9" containerID="858bde005097f9d76fd861e0958c9f16acc968dd662a90dcdcc276c1a0cd2863" exitCode=0 Mar 18 16:36:03 crc 
kubenswrapper[4792]: I0318 16:36:03.904841 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" event={"ID":"fbef5a02-ad3a-472e-b026-1151782480f9","Type":"ContainerDied","Data":"858bde005097f9d76fd861e0958c9f16acc968dd662a90dcdcc276c1a0cd2863"} Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.344576 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.526153 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg4r7\" (UniqueName: \"kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7\") pod \"fbef5a02-ad3a-472e-b026-1151782480f9\" (UID: \"fbef5a02-ad3a-472e-b026-1151782480f9\") " Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.544136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7" (OuterVolumeSpecName: "kube-api-access-wg4r7") pod "fbef5a02-ad3a-472e-b026-1151782480f9" (UID: "fbef5a02-ad3a-472e-b026-1151782480f9"). InnerVolumeSpecName "kube-api-access-wg4r7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.630311 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg4r7\" (UniqueName: \"kubernetes.io/projected/fbef5a02-ad3a-472e-b026-1151782480f9-kube-api-access-wg4r7\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.949075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" event={"ID":"fbef5a02-ad3a-472e-b026-1151782480f9","Type":"ContainerDied","Data":"fabd446d89790fe14d4537c3e8f0e7576558b027f5394ae46883fd3e42ee5fc2"} Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.949441 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fabd446d89790fe14d4537c3e8f0e7576558b027f5394ae46883fd3e42ee5fc2" Mar 18 16:36:05 crc kubenswrapper[4792]: I0318 16:36:05.949512 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564196-5pb4t" Mar 18 16:36:06 crc kubenswrapper[4792]: I0318 16:36:06.044794 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-td8gt"] Mar 18 16:36:06 crc kubenswrapper[4792]: I0318 16:36:06.056846 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564190-td8gt"] Mar 18 16:36:07 crc kubenswrapper[4792]: I0318 16:36:07.870663 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e6bf8a-3133-4a80-b99f-605b2a7b15e0" path="/var/lib/kubelet/pods/f4e6bf8a-3133-4a80-b99f-605b2a7b15e0/volumes" Mar 18 16:36:09 crc kubenswrapper[4792]: I0318 16:36:09.856865 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:36:09 crc kubenswrapper[4792]: E0318 16:36:09.858155 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.379602 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:21 crc kubenswrapper[4792]: E0318 16:36:21.380876 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbef5a02-ad3a-472e-b026-1151782480f9" containerName="oc" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.380896 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbef5a02-ad3a-472e-b026-1151782480f9" containerName="oc" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.381262 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbef5a02-ad3a-472e-b026-1151782480f9" containerName="oc" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.383609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.394693 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.571287 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.571915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.572083 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9cc\" (UniqueName: \"kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.679751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.680034 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.680161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9cc\" (UniqueName: \"kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.681344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.681765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:21 crc kubenswrapper[4792]: I0318 16:36:21.731823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9cc\" (UniqueName: \"kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc\") pod \"certified-operators-nh94b\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:22 crc kubenswrapper[4792]: I0318 16:36:22.022648 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:22 crc kubenswrapper[4792]: I0318 16:36:22.786755 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:22 crc kubenswrapper[4792]: I0318 16:36:22.856764 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:36:22 crc kubenswrapper[4792]: E0318 16:36:22.857303 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:36:23 crc kubenswrapper[4792]: I0318 16:36:23.145431 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerID="1a134abb404200294c5fb13c59032e7169873400b5650d5bdd107f26c11830b4" exitCode=0 Mar 18 16:36:23 crc kubenswrapper[4792]: I0318 16:36:23.145543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerDied","Data":"1a134abb404200294c5fb13c59032e7169873400b5650d5bdd107f26c11830b4"} Mar 18 16:36:23 crc kubenswrapper[4792]: I0318 16:36:23.146067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerStarted","Data":"4f99b760de9b61c631e68d5f1bbed61d4f3368ea84adaf428eb09d950ab6f34b"} Mar 18 16:36:24 crc kubenswrapper[4792]: I0318 16:36:24.159263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" 
event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerStarted","Data":"be09814ad939e835f176254d758f0da400581067aa1bfaae63eddbabbc594eb9"} Mar 18 16:36:26 crc kubenswrapper[4792]: I0318 16:36:26.182724 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerID="be09814ad939e835f176254d758f0da400581067aa1bfaae63eddbabbc594eb9" exitCode=0 Mar 18 16:36:26 crc kubenswrapper[4792]: I0318 16:36:26.182808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerDied","Data":"be09814ad939e835f176254d758f0da400581067aa1bfaae63eddbabbc594eb9"} Mar 18 16:36:27 crc kubenswrapper[4792]: I0318 16:36:27.197463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerStarted","Data":"058f9596a57e0ced577e930864c190f7f0f2dcc38193f4a89568301dc8f7a449"} Mar 18 16:36:27 crc kubenswrapper[4792]: I0318 16:36:27.224398 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nh94b" podStartSLOduration=2.750282389 podStartE2EDuration="6.224372044s" podCreationTimestamp="2026-03-18 16:36:21 +0000 UTC" firstStartedPulling="2026-03-18 16:36:23.149173334 +0000 UTC m=+3732.018502271" lastFinishedPulling="2026-03-18 16:36:26.623262969 +0000 UTC m=+3735.492591926" observedRunningTime="2026-03-18 16:36:27.217837248 +0000 UTC m=+3736.087166195" watchObservedRunningTime="2026-03-18 16:36:27.224372044 +0000 UTC m=+3736.093700991" Mar 18 16:36:32 crc kubenswrapper[4792]: I0318 16:36:32.023295 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:32 crc kubenswrapper[4792]: I0318 16:36:32.023900 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:32 crc kubenswrapper[4792]: I0318 16:36:32.100517 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:32 crc kubenswrapper[4792]: I0318 16:36:32.294283 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:32 crc kubenswrapper[4792]: I0318 16:36:32.353015 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:34 crc kubenswrapper[4792]: I0318 16:36:34.273762 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nh94b" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="registry-server" containerID="cri-o://058f9596a57e0ced577e930864c190f7f0f2dcc38193f4a89568301dc8f7a449" gracePeriod=2 Mar 18 16:36:35 crc kubenswrapper[4792]: I0318 16:36:35.296423 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerID="058f9596a57e0ced577e930864c190f7f0f2dcc38193f4a89568301dc8f7a449" exitCode=0 Mar 18 16:36:35 crc kubenswrapper[4792]: I0318 16:36:35.296489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerDied","Data":"058f9596a57e0ced577e930864c190f7f0f2dcc38193f4a89568301dc8f7a449"} Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.142368 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.251494 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities\") pod \"fa7f2856-aea2-40bb-8ed5-597c8193791f\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.251901 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content\") pod \"fa7f2856-aea2-40bb-8ed5-597c8193791f\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.252090 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr9cc\" (UniqueName: \"kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc\") pod \"fa7f2856-aea2-40bb-8ed5-597c8193791f\" (UID: \"fa7f2856-aea2-40bb-8ed5-597c8193791f\") " Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.252830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities" (OuterVolumeSpecName: "utilities") pod "fa7f2856-aea2-40bb-8ed5-597c8193791f" (UID: "fa7f2856-aea2-40bb-8ed5-597c8193791f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.253933 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.261207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc" (OuterVolumeSpecName: "kube-api-access-fr9cc") pod "fa7f2856-aea2-40bb-8ed5-597c8193791f" (UID: "fa7f2856-aea2-40bb-8ed5-597c8193791f"). InnerVolumeSpecName "kube-api-access-fr9cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.313316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa7f2856-aea2-40bb-8ed5-597c8193791f" (UID: "fa7f2856-aea2-40bb-8ed5-597c8193791f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.319246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh94b" event={"ID":"fa7f2856-aea2-40bb-8ed5-597c8193791f","Type":"ContainerDied","Data":"4f99b760de9b61c631e68d5f1bbed61d4f3368ea84adaf428eb09d950ab6f34b"} Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.319343 4792 scope.go:117] "RemoveContainer" containerID="058f9596a57e0ced577e930864c190f7f0f2dcc38193f4a89568301dc8f7a449" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.319349 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh94b" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.363269 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa7f2856-aea2-40bb-8ed5-597c8193791f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.363729 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr9cc\" (UniqueName: \"kubernetes.io/projected/fa7f2856-aea2-40bb-8ed5-597c8193791f-kube-api-access-fr9cc\") on node \"crc\" DevicePath \"\"" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.371153 4792 scope.go:117] "RemoveContainer" containerID="be09814ad939e835f176254d758f0da400581067aa1bfaae63eddbabbc594eb9" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.386638 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.409809 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nh94b"] Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.418613 4792 scope.go:117] "RemoveContainer" containerID="1a134abb404200294c5fb13c59032e7169873400b5650d5bdd107f26c11830b4" Mar 18 16:36:36 crc kubenswrapper[4792]: I0318 16:36:36.854602 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:36:36 crc kubenswrapper[4792]: E0318 16:36:36.855147 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:36:37 crc kubenswrapper[4792]: I0318 16:36:37.868411 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" path="/var/lib/kubelet/pods/fa7f2856-aea2-40bb-8ed5-597c8193791f/volumes" Mar 18 16:36:49 crc kubenswrapper[4792]: I0318 16:36:49.855382 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:36:49 crc kubenswrapper[4792]: E0318 16:36:49.856357 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:36:50 crc kubenswrapper[4792]: I0318 16:36:50.646254 4792 scope.go:117] "RemoveContainer" containerID="4ddc4469a1c2630c2352d2e4b873fbef7044344333742ecf05cd3b7557465ab2" Mar 18 16:37:03 crc kubenswrapper[4792]: I0318 16:37:03.855220 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:37:03 crc kubenswrapper[4792]: E0318 16:37:03.856002 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:37:14 crc kubenswrapper[4792]: I0318 16:37:14.855009 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" 
Mar 18 16:37:14 crc kubenswrapper[4792]: E0318 16:37:14.856204 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:37:27 crc kubenswrapper[4792]: I0318 16:37:27.855245 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:37:27 crc kubenswrapper[4792]: E0318 16:37:27.856566 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:37:40 crc kubenswrapper[4792]: I0318 16:37:40.854501 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:37:40 crc kubenswrapper[4792]: E0318 16:37:40.855340 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:37:54 crc kubenswrapper[4792]: I0318 16:37:54.854803 4792 scope.go:117] "RemoveContainer" 
containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:37:54 crc kubenswrapper[4792]: E0318 16:37:54.855897 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.166903 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564198-9bhb2"] Mar 18 16:38:00 crc kubenswrapper[4792]: E0318 16:38:00.168773 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="extract-content" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.168793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="extract-content" Mar 18 16:38:00 crc kubenswrapper[4792]: E0318 16:38:00.168813 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="registry-server" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.168819 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="registry-server" Mar 18 16:38:00 crc kubenswrapper[4792]: E0318 16:38:00.168851 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="extract-utilities" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.168863 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="extract-utilities" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 
16:38:00.169167 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7f2856-aea2-40bb-8ed5-597c8193791f" containerName="registry-server" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.170495 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.173042 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.173275 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.173646 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.182023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-9bhb2"] Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.301079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6frj\" (UniqueName: \"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj\") pod \"auto-csr-approver-29564198-9bhb2\" (UID: \"48cd98a5-46ee-419c-9cb6-def10cca9727\") " pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.404044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6frj\" (UniqueName: \"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj\") pod \"auto-csr-approver-29564198-9bhb2\" (UID: \"48cd98a5-46ee-419c-9cb6-def10cca9727\") " pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.424803 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6frj\" (UniqueName: \"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj\") pod \"auto-csr-approver-29564198-9bhb2\" (UID: \"48cd98a5-46ee-419c-9cb6-def10cca9727\") " pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:00 crc kubenswrapper[4792]: I0318 16:38:00.493617 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:01 crc kubenswrapper[4792]: I0318 16:38:01.005173 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-9bhb2"] Mar 18 16:38:01 crc kubenswrapper[4792]: I0318 16:38:01.295956 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" event={"ID":"48cd98a5-46ee-419c-9cb6-def10cca9727","Type":"ContainerStarted","Data":"6d1e46ec5469e7e8e93268029a477e9037048e223dc5801fcf3e586bf6956685"} Mar 18 16:38:03 crc kubenswrapper[4792]: I0318 16:38:03.324852 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" event={"ID":"48cd98a5-46ee-419c-9cb6-def10cca9727","Type":"ContainerStarted","Data":"d85f248843d8e50213742db284e592c4601f2d22183ecb2a6110297b88693917"} Mar 18 16:38:03 crc kubenswrapper[4792]: I0318 16:38:03.359873 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" podStartSLOduration=1.6420844570000002 podStartE2EDuration="3.35985185s" podCreationTimestamp="2026-03-18 16:38:00 +0000 UTC" firstStartedPulling="2026-03-18 16:38:01.008407892 +0000 UTC m=+3829.877736829" lastFinishedPulling="2026-03-18 16:38:02.726175275 +0000 UTC m=+3831.595504222" observedRunningTime="2026-03-18 16:38:03.343340698 +0000 UTC m=+3832.212669645" watchObservedRunningTime="2026-03-18 16:38:03.35985185 +0000 UTC m=+3832.229180797" 
Mar 18 16:38:04 crc kubenswrapper[4792]: I0318 16:38:04.340044 4792 generic.go:334] "Generic (PLEG): container finished" podID="48cd98a5-46ee-419c-9cb6-def10cca9727" containerID="d85f248843d8e50213742db284e592c4601f2d22183ecb2a6110297b88693917" exitCode=0 Mar 18 16:38:04 crc kubenswrapper[4792]: I0318 16:38:04.340095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" event={"ID":"48cd98a5-46ee-419c-9cb6-def10cca9727","Type":"ContainerDied","Data":"d85f248843d8e50213742db284e592c4601f2d22183ecb2a6110297b88693917"} Mar 18 16:38:05 crc kubenswrapper[4792]: I0318 16:38:05.798217 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:05 crc kubenswrapper[4792]: I0318 16:38:05.859797 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:38:05 crc kubenswrapper[4792]: E0318 16:38:05.860476 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:38:05 crc kubenswrapper[4792]: I0318 16:38:05.888324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6frj\" (UniqueName: \"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj\") pod \"48cd98a5-46ee-419c-9cb6-def10cca9727\" (UID: \"48cd98a5-46ee-419c-9cb6-def10cca9727\") " Mar 18 16:38:05 crc kubenswrapper[4792]: I0318 16:38:05.895782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj" (OuterVolumeSpecName: "kube-api-access-w6frj") pod "48cd98a5-46ee-419c-9cb6-def10cca9727" (UID: "48cd98a5-46ee-419c-9cb6-def10cca9727"). InnerVolumeSpecName "kube-api-access-w6frj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:38:05 crc kubenswrapper[4792]: I0318 16:38:05.993760 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6frj\" (UniqueName: \"kubernetes.io/projected/48cd98a5-46ee-419c-9cb6-def10cca9727-kube-api-access-w6frj\") on node \"crc\" DevicePath \"\"" Mar 18 16:38:06 crc kubenswrapper[4792]: I0318 16:38:06.366271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" event={"ID":"48cd98a5-46ee-419c-9cb6-def10cca9727","Type":"ContainerDied","Data":"6d1e46ec5469e7e8e93268029a477e9037048e223dc5801fcf3e586bf6956685"} Mar 18 16:38:06 crc kubenswrapper[4792]: I0318 16:38:06.366320 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1e46ec5469e7e8e93268029a477e9037048e223dc5801fcf3e586bf6956685" Mar 18 16:38:06 crc kubenswrapper[4792]: I0318 16:38:06.366398 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564198-9bhb2" Mar 18 16:38:06 crc kubenswrapper[4792]: I0318 16:38:06.433170 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-zfrht"] Mar 18 16:38:06 crc kubenswrapper[4792]: I0318 16:38:06.444856 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564192-zfrht"] Mar 18 16:38:07 crc kubenswrapper[4792]: I0318 16:38:07.877287 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33676ab0-bcc8-413d-9f28-7c6264a3cb5c" path="/var/lib/kubelet/pods/33676ab0-bcc8-413d-9f28-7c6264a3cb5c/volumes" Mar 18 16:38:16 crc kubenswrapper[4792]: I0318 16:38:16.853855 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:38:16 crc kubenswrapper[4792]: E0318 16:38:16.854587 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:38:29 crc kubenswrapper[4792]: I0318 16:38:29.857271 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:38:29 crc kubenswrapper[4792]: E0318 16:38:29.858738 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:38:40 crc kubenswrapper[4792]: I0318 16:38:40.855345 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:38:40 crc kubenswrapper[4792]: E0318 16:38:40.857419 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:38:50 crc kubenswrapper[4792]: I0318 16:38:50.762415 4792 scope.go:117] "RemoveContainer" containerID="5d2ef11206240159da58db53b302afebc6770cc0e1085364bf1c95af9c100f44" Mar 18 16:38:53 crc kubenswrapper[4792]: I0318 16:38:53.855093 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:38:53 crc kubenswrapper[4792]: E0318 16:38:53.856087 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:39:08 crc kubenswrapper[4792]: I0318 16:39:08.854806 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:39:08 crc kubenswrapper[4792]: E0318 16:39:08.855695 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:39:20 crc kubenswrapper[4792]: I0318 16:39:20.855644 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:39:20 crc kubenswrapper[4792]: E0318 16:39:20.856738 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:39:31 crc kubenswrapper[4792]: I0318 16:39:31.865738 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:39:31 crc kubenswrapper[4792]: E0318 16:39:31.867170 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:39:47 crc kubenswrapper[4792]: I0318 16:39:47.854742 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:39:47 crc kubenswrapper[4792]: E0318 16:39:47.856076 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:39:59 crc kubenswrapper[4792]: I0318 16:39:59.857275 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:39:59 crc kubenswrapper[4792]: E0318 16:39:59.859016 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.147628 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564200-x4dh8"] Mar 18 16:40:00 crc kubenswrapper[4792]: E0318 16:40:00.148894 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd98a5-46ee-419c-9cb6-def10cca9727" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.149033 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd98a5-46ee-419c-9cb6-def10cca9727" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.149461 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cd98a5-46ee-419c-9cb6-def10cca9727" containerName="oc" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.150632 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.153047 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.153580 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.153672 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.160487 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-x4dh8"] Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.250662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhkx\" (UniqueName: \"kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx\") pod \"auto-csr-approver-29564200-x4dh8\" (UID: \"8c690c36-4098-4e0b-a5c0-c386561348a4\") " pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.353033 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhkx\" (UniqueName: \"kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx\") pod \"auto-csr-approver-29564200-x4dh8\" (UID: \"8c690c36-4098-4e0b-a5c0-c386561348a4\") " pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.387940 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhkx\" (UniqueName: \"kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx\") pod \"auto-csr-approver-29564200-x4dh8\" (UID: \"8c690c36-4098-4e0b-a5c0-c386561348a4\") " 
pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.484134 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:00 crc kubenswrapper[4792]: I0318 16:40:00.983581 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-x4dh8"] Mar 18 16:40:01 crc kubenswrapper[4792]: I0318 16:40:01.712314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" event={"ID":"8c690c36-4098-4e0b-a5c0-c386561348a4","Type":"ContainerStarted","Data":"2cd07de639da05cdeaa8c50476c40ad79641fc6973f6cbde66c780c06bc096cb"} Mar 18 16:40:02 crc kubenswrapper[4792]: I0318 16:40:02.725867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" event={"ID":"8c690c36-4098-4e0b-a5c0-c386561348a4","Type":"ContainerStarted","Data":"7d99b79b7bdbe821959b17f37905b2462586afc95c2f276016ece458302d36c2"} Mar 18 16:40:02 crc kubenswrapper[4792]: I0318 16:40:02.775242 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" podStartSLOduration=1.359845469 podStartE2EDuration="2.775219242s" podCreationTimestamp="2026-03-18 16:40:00 +0000 UTC" firstStartedPulling="2026-03-18 16:40:00.989834383 +0000 UTC m=+3949.859163320" lastFinishedPulling="2026-03-18 16:40:02.405208156 +0000 UTC m=+3951.274537093" observedRunningTime="2026-03-18 16:40:02.749699646 +0000 UTC m=+3951.619028583" watchObservedRunningTime="2026-03-18 16:40:02.775219242 +0000 UTC m=+3951.644548179" Mar 18 16:40:03 crc kubenswrapper[4792]: I0318 16:40:03.739011 4792 generic.go:334] "Generic (PLEG): container finished" podID="8c690c36-4098-4e0b-a5c0-c386561348a4" containerID="7d99b79b7bdbe821959b17f37905b2462586afc95c2f276016ece458302d36c2" exitCode=0 Mar 18 16:40:03 crc 
kubenswrapper[4792]: I0318 16:40:03.739112 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" event={"ID":"8c690c36-4098-4e0b-a5c0-c386561348a4","Type":"ContainerDied","Data":"7d99b79b7bdbe821959b17f37905b2462586afc95c2f276016ece458302d36c2"} Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.168551 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.192418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhkx\" (UniqueName: \"kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx\") pod \"8c690c36-4098-4e0b-a5c0-c386561348a4\" (UID: \"8c690c36-4098-4e0b-a5c0-c386561348a4\") " Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.205280 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx" (OuterVolumeSpecName: "kube-api-access-vwhkx") pod "8c690c36-4098-4e0b-a5c0-c386561348a4" (UID: "8c690c36-4098-4e0b-a5c0-c386561348a4"). InnerVolumeSpecName "kube-api-access-vwhkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.295941 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhkx\" (UniqueName: \"kubernetes.io/projected/8c690c36-4098-4e0b-a5c0-c386561348a4-kube-api-access-vwhkx\") on node \"crc\" DevicePath \"\"" Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.759290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" event={"ID":"8c690c36-4098-4e0b-a5c0-c386561348a4","Type":"ContainerDied","Data":"2cd07de639da05cdeaa8c50476c40ad79641fc6973f6cbde66c780c06bc096cb"} Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.759332 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd07de639da05cdeaa8c50476c40ad79641fc6973f6cbde66c780c06bc096cb" Mar 18 16:40:05 crc kubenswrapper[4792]: I0318 16:40:05.759394 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564200-x4dh8" Mar 18 16:40:06 crc kubenswrapper[4792]: I0318 16:40:06.256478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fpccv"] Mar 18 16:40:06 crc kubenswrapper[4792]: I0318 16:40:06.273161 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564194-fpccv"] Mar 18 16:40:07 crc kubenswrapper[4792]: I0318 16:40:07.875903 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02939277-0f04-4780-87fe-5dad11793e5c" path="/var/lib/kubelet/pods/02939277-0f04-4780-87fe-5dad11793e5c/volumes" Mar 18 16:40:12 crc kubenswrapper[4792]: I0318 16:40:12.854790 4792 scope.go:117] "RemoveContainer" containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:40:13 crc kubenswrapper[4792]: I0318 16:40:13.841682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c"} Mar 18 16:40:50 crc kubenswrapper[4792]: I0318 16:40:50.879091 4792 scope.go:117] "RemoveContainer" containerID="6fd8c20e66eb249409ec65d2f867c21cb8eecf0c8cbdb17cb7137010ca10c1a3" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.307741 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:41:55 crc kubenswrapper[4792]: E0318 16:41:55.308873 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c690c36-4098-4e0b-a5c0-c386561348a4" containerName="oc" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.308893 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c690c36-4098-4e0b-a5c0-c386561348a4" containerName="oc" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.309175 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c690c36-4098-4e0b-a5c0-c386561348a4" containerName="oc" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.311354 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.324445 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.455814 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.455912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.456100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqwc\" (UniqueName: \"kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.558912 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.559047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.559092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqwc\" (UniqueName: \"kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.559814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.559866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.582366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqwc\" (UniqueName: \"kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc\") pod \"redhat-operators-w85nk\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:55 crc kubenswrapper[4792]: I0318 16:41:55.641134 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:41:56 crc kubenswrapper[4792]: I0318 16:41:56.203085 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:41:57 crc kubenswrapper[4792]: I0318 16:41:57.098949 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerID="96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2" exitCode=0 Mar 18 16:41:57 crc kubenswrapper[4792]: I0318 16:41:57.099008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerDied","Data":"96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2"} Mar 18 16:41:57 crc kubenswrapper[4792]: I0318 16:41:57.099291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerStarted","Data":"f341dbcb0df2953d5dfe9f7dbf11761f9683b8ac3e54581e8c8c53415b5b353d"} Mar 18 16:41:57 crc kubenswrapper[4792]: I0318 16:41:57.101742 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.147798 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerStarted","Data":"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32"} Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.152542 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564202-4256b"] Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.154640 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.157193 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.157792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.162587 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.180100 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-4256b"] Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.292111 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzp4j\" (UniqueName: \"kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j\") pod \"auto-csr-approver-29564202-4256b\" (UID: \"ef8a5198-8500-4175-b58c-f7849d808c26\") " pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.394314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzp4j\" (UniqueName: \"kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j\") pod \"auto-csr-approver-29564202-4256b\" (UID: \"ef8a5198-8500-4175-b58c-f7849d808c26\") " pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.429270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzp4j\" (UniqueName: \"kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j\") pod \"auto-csr-approver-29564202-4256b\" (UID: \"ef8a5198-8500-4175-b58c-f7849d808c26\") " 
pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.476719 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:00 crc kubenswrapper[4792]: W0318 16:42:00.972052 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef8a5198_8500_4175_b58c_f7849d808c26.slice/crio-129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3 WatchSource:0}: Error finding container 129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3: Status 404 returned error can't find the container with id 129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3 Mar 18 16:42:00 crc kubenswrapper[4792]: I0318 16:42:00.974379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-4256b"] Mar 18 16:42:01 crc kubenswrapper[4792]: I0318 16:42:01.161439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-4256b" event={"ID":"ef8a5198-8500-4175-b58c-f7849d808c26","Type":"ContainerStarted","Data":"129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3"} Mar 18 16:42:03 crc kubenswrapper[4792]: I0318 16:42:03.188493 4792 generic.go:334] "Generic (PLEG): container finished" podID="ef8a5198-8500-4175-b58c-f7849d808c26" containerID="99458f12a7ba6e80f728170a7738a7a7b5a41d54d3d46204852457a696b1ddbe" exitCode=0 Mar 18 16:42:03 crc kubenswrapper[4792]: I0318 16:42:03.188554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-4256b" event={"ID":"ef8a5198-8500-4175-b58c-f7849d808c26","Type":"ContainerDied","Data":"99458f12a7ba6e80f728170a7738a7a7b5a41d54d3d46204852457a696b1ddbe"} Mar 18 16:42:04 crc kubenswrapper[4792]: I0318 16:42:04.767233 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:04 crc kubenswrapper[4792]: I0318 16:42:04.917751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzp4j\" (UniqueName: \"kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j\") pod \"ef8a5198-8500-4175-b58c-f7849d808c26\" (UID: \"ef8a5198-8500-4175-b58c-f7849d808c26\") " Mar 18 16:42:04 crc kubenswrapper[4792]: I0318 16:42:04.926306 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j" (OuterVolumeSpecName: "kube-api-access-hzp4j") pod "ef8a5198-8500-4175-b58c-f7849d808c26" (UID: "ef8a5198-8500-4175-b58c-f7849d808c26"). InnerVolumeSpecName "kube-api-access-hzp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.021642 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzp4j\" (UniqueName: \"kubernetes.io/projected/ef8a5198-8500-4175-b58c-f7849d808c26-kube-api-access-hzp4j\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.214080 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerID="93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32" exitCode=0 Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.214213 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerDied","Data":"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32"} Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.217079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564202-4256b" 
event={"ID":"ef8a5198-8500-4175-b58c-f7849d808c26","Type":"ContainerDied","Data":"129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3"} Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.217109 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="129f80499150454bdde3710b90222a9568bd4e49a8f0fe094766b9676bd1c9c3" Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.217157 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564202-4256b" Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.875289 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-5pb4t"] Mar 18 16:42:05 crc kubenswrapper[4792]: I0318 16:42:05.878312 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564196-5pb4t"] Mar 18 16:42:06 crc kubenswrapper[4792]: I0318 16:42:06.234044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerStarted","Data":"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630"} Mar 18 16:42:06 crc kubenswrapper[4792]: I0318 16:42:06.264488 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w85nk" podStartSLOduration=2.470897216 podStartE2EDuration="11.264468681s" podCreationTimestamp="2026-03-18 16:41:55 +0000 UTC" firstStartedPulling="2026-03-18 16:41:57.101445768 +0000 UTC m=+4065.970774705" lastFinishedPulling="2026-03-18 16:42:05.895017233 +0000 UTC m=+4074.764346170" observedRunningTime="2026-03-18 16:42:06.252900176 +0000 UTC m=+4075.122229113" watchObservedRunningTime="2026-03-18 16:42:06.264468681 +0000 UTC m=+4075.133797608" Mar 18 16:42:07 crc kubenswrapper[4792]: I0318 16:42:07.870928 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fbef5a02-ad3a-472e-b026-1151782480f9" path="/var/lib/kubelet/pods/fbef5a02-ad3a-472e-b026-1151782480f9/volumes" Mar 18 16:42:15 crc kubenswrapper[4792]: I0318 16:42:15.641496 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:15 crc kubenswrapper[4792]: I0318 16:42:15.642140 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:16 crc kubenswrapper[4792]: I0318 16:42:16.698168 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w85nk" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" probeResult="failure" output=< Mar 18 16:42:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:42:16 crc kubenswrapper[4792]: > Mar 18 16:42:26 crc kubenswrapper[4792]: I0318 16:42:26.695880 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w85nk" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" probeResult="failure" output=< Mar 18 16:42:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:42:26 crc kubenswrapper[4792]: > Mar 18 16:42:30 crc kubenswrapper[4792]: I0318 16:42:30.322195 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:42:30 crc kubenswrapper[4792]: I0318 16:42:30.322747 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:42:35 crc kubenswrapper[4792]: I0318 16:42:35.693798 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:35 crc kubenswrapper[4792]: I0318 16:42:35.755772 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:35 crc kubenswrapper[4792]: I0318 16:42:35.938490 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:42:37 crc kubenswrapper[4792]: I0318 16:42:37.604149 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w85nk" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" containerID="cri-o://854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630" gracePeriod=2 Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.132802 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.209807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content\") pod \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.210336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fqwc\" (UniqueName: \"kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc\") pod \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.210447 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities\") pod \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\" (UID: \"ec8f1b3d-6c3d-4856-8c2e-e06459806575\") " Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.211314 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities" (OuterVolumeSpecName: "utilities") pod "ec8f1b3d-6c3d-4856-8c2e-e06459806575" (UID: "ec8f1b3d-6c3d-4856-8c2e-e06459806575"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.212106 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.219420 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc" (OuterVolumeSpecName: "kube-api-access-2fqwc") pod "ec8f1b3d-6c3d-4856-8c2e-e06459806575" (UID: "ec8f1b3d-6c3d-4856-8c2e-e06459806575"). InnerVolumeSpecName "kube-api-access-2fqwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.314784 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fqwc\" (UniqueName: \"kubernetes.io/projected/ec8f1b3d-6c3d-4856-8c2e-e06459806575-kube-api-access-2fqwc\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.337291 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec8f1b3d-6c3d-4856-8c2e-e06459806575" (UID: "ec8f1b3d-6c3d-4856-8c2e-e06459806575"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.416933 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f1b3d-6c3d-4856-8c2e-e06459806575-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.618449 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerID="854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630" exitCode=0 Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.618558 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w85nk" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.619111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerDied","Data":"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630"} Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.619169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w85nk" event={"ID":"ec8f1b3d-6c3d-4856-8c2e-e06459806575","Type":"ContainerDied","Data":"f341dbcb0df2953d5dfe9f7dbf11761f9683b8ac3e54581e8c8c53415b5b353d"} Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.619195 4792 scope.go:117] "RemoveContainer" containerID="854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.647478 4792 scope.go:117] "RemoveContainer" containerID="93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.662191 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 
16:42:38.675750 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w85nk"] Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.692481 4792 scope.go:117] "RemoveContainer" containerID="96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.740298 4792 scope.go:117] "RemoveContainer" containerID="854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630" Mar 18 16:42:38 crc kubenswrapper[4792]: E0318 16:42:38.740895 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630\": container with ID starting with 854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630 not found: ID does not exist" containerID="854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.740961 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630"} err="failed to get container status \"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630\": rpc error: code = NotFound desc = could not find container \"854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630\": container with ID starting with 854acbf40f2bb2687edc45b4be92c70ee5271690938d77184570f4d589851630 not found: ID does not exist" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.741018 4792 scope.go:117] "RemoveContainer" containerID="93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32" Mar 18 16:42:38 crc kubenswrapper[4792]: E0318 16:42:38.742291 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32\": container with ID 
starting with 93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32 not found: ID does not exist" containerID="93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.742583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32"} err="failed to get container status \"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32\": rpc error: code = NotFound desc = could not find container \"93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32\": container with ID starting with 93358bad1a0006d42712efb4380ab226fadeb4ed06ef261b91cb52cc7a477e32 not found: ID does not exist" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.742606 4792 scope.go:117] "RemoveContainer" containerID="96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2" Mar 18 16:42:38 crc kubenswrapper[4792]: E0318 16:42:38.742992 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2\": container with ID starting with 96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2 not found: ID does not exist" containerID="96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2" Mar 18 16:42:38 crc kubenswrapper[4792]: I0318 16:42:38.743038 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2"} err="failed to get container status \"96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2\": rpc error: code = NotFound desc = could not find container \"96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2\": container with ID starting with 96d2b64c5902c4f73f0cce714ed44eb7f7a53ce5550373135102ad539b9bfed2 not found: 
ID does not exist" Mar 18 16:42:39 crc kubenswrapper[4792]: I0318 16:42:39.868663 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" path="/var/lib/kubelet/pods/ec8f1b3d-6c3d-4856-8c2e-e06459806575/volumes" Mar 18 16:42:50 crc kubenswrapper[4792]: I0318 16:42:50.989951 4792 scope.go:117] "RemoveContainer" containerID="858bde005097f9d76fd861e0958c9f16acc968dd662a90dcdcc276c1a0cd2863" Mar 18 16:43:00 crc kubenswrapper[4792]: I0318 16:43:00.321584 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:43:00 crc kubenswrapper[4792]: I0318 16:43:00.322207 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.304825 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:01 crc kubenswrapper[4792]: E0318 16:43:01.305750 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="extract-utilities" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.305773 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="extract-utilities" Mar 18 16:43:01 crc kubenswrapper[4792]: E0318 16:43:01.305800 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="extract-content" Mar 18 16:43:01 crc 
kubenswrapper[4792]: I0318 16:43:01.305809 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="extract-content" Mar 18 16:43:01 crc kubenswrapper[4792]: E0318 16:43:01.305829 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8a5198-8500-4175-b58c-f7849d808c26" containerName="oc" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.305837 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8a5198-8500-4175-b58c-f7849d808c26" containerName="oc" Mar 18 16:43:01 crc kubenswrapper[4792]: E0318 16:43:01.305851 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.305858 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.306202 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8a5198-8500-4175-b58c-f7849d808c26" containerName="oc" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.306249 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8f1b3d-6c3d-4856-8c2e-e06459806575" containerName="registry-server" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.309245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.328561 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.394286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.394564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.394595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgpm\" (UniqueName: \"kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.497381 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.497784 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kvgpm\" (UniqueName: \"kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.497903 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.498200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.498383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.519920 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgpm\" (UniqueName: \"kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm\") pod \"redhat-marketplace-s6qv8\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:01 crc kubenswrapper[4792]: I0318 16:43:01.645514 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:02 crc kubenswrapper[4792]: I0318 16:43:02.132911 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:02 crc kubenswrapper[4792]: I0318 16:43:02.910279 4792 generic.go:334] "Generic (PLEG): container finished" podID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerID="df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982" exitCode=0 Mar 18 16:43:02 crc kubenswrapper[4792]: I0318 16:43:02.911369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerDied","Data":"df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982"} Mar 18 16:43:02 crc kubenswrapper[4792]: I0318 16:43:02.911450 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerStarted","Data":"b22df5855c5dbddbec9adb94b7396a3a501d3d3f3069c1bb42f0ea90f00e3b7e"} Mar 18 16:43:04 crc kubenswrapper[4792]: I0318 16:43:04.946349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerStarted","Data":"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af"} Mar 18 16:43:05 crc kubenswrapper[4792]: I0318 16:43:05.962440 4792 generic.go:334] "Generic (PLEG): container finished" podID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerID="50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af" exitCode=0 Mar 18 16:43:05 crc kubenswrapper[4792]: I0318 16:43:05.962546 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" 
event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerDied","Data":"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af"} Mar 18 16:43:05 crc kubenswrapper[4792]: I0318 16:43:05.963084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerStarted","Data":"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937"} Mar 18 16:43:05 crc kubenswrapper[4792]: I0318 16:43:05.991486 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s6qv8" podStartSLOduration=2.279757481 podStartE2EDuration="4.991444606s" podCreationTimestamp="2026-03-18 16:43:01 +0000 UTC" firstStartedPulling="2026-03-18 16:43:02.91467708 +0000 UTC m=+4131.784006017" lastFinishedPulling="2026-03-18 16:43:05.626364175 +0000 UTC m=+4134.495693142" observedRunningTime="2026-03-18 16:43:05.982683659 +0000 UTC m=+4134.852012596" watchObservedRunningTime="2026-03-18 16:43:05.991444606 +0000 UTC m=+4134.860773543" Mar 18 16:43:11 crc kubenswrapper[4792]: I0318 16:43:11.646350 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:11 crc kubenswrapper[4792]: I0318 16:43:11.647011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:11 crc kubenswrapper[4792]: I0318 16:43:11.705467 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:12 crc kubenswrapper[4792]: I0318 16:43:12.092044 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:12 crc kubenswrapper[4792]: I0318 16:43:12.148200 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.048163 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s6qv8" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="registry-server" containerID="cri-o://0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937" gracePeriod=2 Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.610672 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.764474 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvgpm\" (UniqueName: \"kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm\") pod \"398d3e4e-4482-4625-9a09-3386ce1b7b56\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.764637 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities\") pod \"398d3e4e-4482-4625-9a09-3386ce1b7b56\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.764816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content\") pod \"398d3e4e-4482-4625-9a09-3386ce1b7b56\" (UID: \"398d3e4e-4482-4625-9a09-3386ce1b7b56\") " Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.769920 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities" (OuterVolumeSpecName: "utilities") pod "398d3e4e-4482-4625-9a09-3386ce1b7b56" (UID: 
"398d3e4e-4482-4625-9a09-3386ce1b7b56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.776109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm" (OuterVolumeSpecName: "kube-api-access-kvgpm") pod "398d3e4e-4482-4625-9a09-3386ce1b7b56" (UID: "398d3e4e-4482-4625-9a09-3386ce1b7b56"). InnerVolumeSpecName "kube-api-access-kvgpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.856859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398d3e4e-4482-4625-9a09-3386ce1b7b56" (UID: "398d3e4e-4482-4625-9a09-3386ce1b7b56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.868097 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.868125 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398d3e4e-4482-4625-9a09-3386ce1b7b56-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:43:14 crc kubenswrapper[4792]: I0318 16:43:14.868136 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvgpm\" (UniqueName: \"kubernetes.io/projected/398d3e4e-4482-4625-9a09-3386ce1b7b56-kube-api-access-kvgpm\") on node \"crc\" DevicePath \"\"" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.074796 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerID="0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937" exitCode=0 Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.074852 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerDied","Data":"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937"} Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.074885 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s6qv8" event={"ID":"398d3e4e-4482-4625-9a09-3386ce1b7b56","Type":"ContainerDied","Data":"b22df5855c5dbddbec9adb94b7396a3a501d3d3f3069c1bb42f0ea90f00e3b7e"} Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.074905 4792 scope.go:117] "RemoveContainer" containerID="0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.074929 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s6qv8" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.112499 4792 scope.go:117] "RemoveContainer" containerID="50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.126328 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.145313 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s6qv8"] Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.164814 4792 scope.go:117] "RemoveContainer" containerID="df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.205784 4792 scope.go:117] "RemoveContainer" containerID="0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937" Mar 18 16:43:15 crc kubenswrapper[4792]: E0318 16:43:15.206556 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937\": container with ID starting with 0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937 not found: ID does not exist" containerID="0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.206689 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937"} err="failed to get container status \"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937\": rpc error: code = NotFound desc = could not find container \"0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937\": container with ID starting with 0df0b0606bcda990e259b7717419f1210ccba9b8a3da3eeab374ea1248c88937 not found: 
ID does not exist" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.206737 4792 scope.go:117] "RemoveContainer" containerID="50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af" Mar 18 16:43:15 crc kubenswrapper[4792]: E0318 16:43:15.207293 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af\": container with ID starting with 50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af not found: ID does not exist" containerID="50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.207333 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af"} err="failed to get container status \"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af\": rpc error: code = NotFound desc = could not find container \"50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af\": container with ID starting with 50df4ec9544117255ccc79f72817e6a729ec01f6564cbb3f8818ee0c39b416af not found: ID does not exist" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.207362 4792 scope.go:117] "RemoveContainer" containerID="df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982" Mar 18 16:43:15 crc kubenswrapper[4792]: E0318 16:43:15.207787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982\": container with ID starting with df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982 not found: ID does not exist" containerID="df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.207843 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982"} err="failed to get container status \"df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982\": rpc error: code = NotFound desc = could not find container \"df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982\": container with ID starting with df923f7d2d1e8a62faf1c3f880b306845f56477972cbe662a7ac60fcf750a982 not found: ID does not exist" Mar 18 16:43:15 crc kubenswrapper[4792]: I0318 16:43:15.872386 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" path="/var/lib/kubelet/pods/398d3e4e-4482-4625-9a09-3386ce1b7b56/volumes" Mar 18 16:43:30 crc kubenswrapper[4792]: I0318 16:43:30.321803 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:43:30 crc kubenswrapper[4792]: I0318 16:43:30.322810 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:43:30 crc kubenswrapper[4792]: I0318 16:43:30.322880 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:43:30 crc kubenswrapper[4792]: I0318 16:43:30.324322 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c"} 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:43:30 crc kubenswrapper[4792]: I0318 16:43:30.324395 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c" gracePeriod=600 Mar 18 16:43:30 crc kubenswrapper[4792]: E0318 16:43:30.845611 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51cef14_7d91_4e08_8045_831f7a9a65f8.slice/crio-be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:43:31 crc kubenswrapper[4792]: I0318 16:43:31.266067 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c" exitCode=0 Mar 18 16:43:31 crc kubenswrapper[4792]: I0318 16:43:31.266180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c"} Mar 18 16:43:31 crc kubenswrapper[4792]: I0318 16:43:31.266593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470"} Mar 18 16:43:31 crc kubenswrapper[4792]: I0318 16:43:31.266645 4792 scope.go:117] "RemoveContainer" 
containerID="b4fa36de7ecb61fdfacfce82af90632f89d98088f07faa64133d3e827002bde7" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.156249 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564204-wv5dx"] Mar 18 16:44:00 crc kubenswrapper[4792]: E0318 16:44:00.157514 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="extract-content" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.157535 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="extract-content" Mar 18 16:44:00 crc kubenswrapper[4792]: E0318 16:44:00.157565 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.157575 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4792]: E0318 16:44:00.157625 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="extract-utilities" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.157634 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="extract-utilities" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.157928 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="398d3e4e-4482-4625-9a09-3386ce1b7b56" containerName="registry-server" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.159120 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.162953 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.163306 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.164582 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.170639 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-wv5dx"] Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.272502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq4z\" (UniqueName: \"kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z\") pod \"auto-csr-approver-29564204-wv5dx\" (UID: \"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1\") " pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.375184 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq4z\" (UniqueName: \"kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z\") pod \"auto-csr-approver-29564204-wv5dx\" (UID: \"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1\") " pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.396442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq4z\" (UniqueName: \"kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z\") pod \"auto-csr-approver-29564204-wv5dx\" (UID: \"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1\") " 
pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.486477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:00 crc kubenswrapper[4792]: I0318 16:44:00.967802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-wv5dx"] Mar 18 16:44:01 crc kubenswrapper[4792]: I0318 16:44:01.715171 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" event={"ID":"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1","Type":"ContainerStarted","Data":"6159c61d9f123ebd2d4f24b9b5af9d6e667903abee54462470ebfd95f7bda249"} Mar 18 16:44:03 crc kubenswrapper[4792]: I0318 16:44:03.747231 4792 generic.go:334] "Generic (PLEG): container finished" podID="bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" containerID="648546e75c13dbabb334fb8764d6e17965b3291bba0ea35d3d004e6fe46c3f69" exitCode=0 Mar 18 16:44:03 crc kubenswrapper[4792]: I0318 16:44:03.747288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" event={"ID":"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1","Type":"ContainerDied","Data":"648546e75c13dbabb334fb8764d6e17965b3291bba0ea35d3d004e6fe46c3f69"} Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.215826 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.317088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkq4z\" (UniqueName: \"kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z\") pod \"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1\" (UID: \"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1\") " Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.323933 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z" (OuterVolumeSpecName: "kube-api-access-wkq4z") pod "bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" (UID: "bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1"). InnerVolumeSpecName "kube-api-access-wkq4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.420001 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkq4z\" (UniqueName: \"kubernetes.io/projected/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1-kube-api-access-wkq4z\") on node \"crc\" DevicePath \"\"" Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.770482 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" event={"ID":"bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1","Type":"ContainerDied","Data":"6159c61d9f123ebd2d4f24b9b5af9d6e667903abee54462470ebfd95f7bda249"} Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.770523 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564204-wv5dx" Mar 18 16:44:05 crc kubenswrapper[4792]: I0318 16:44:05.770538 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6159c61d9f123ebd2d4f24b9b5af9d6e667903abee54462470ebfd95f7bda249" Mar 18 16:44:06 crc kubenswrapper[4792]: I0318 16:44:06.291311 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-9bhb2"] Mar 18 16:44:06 crc kubenswrapper[4792]: I0318 16:44:06.303639 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564198-9bhb2"] Mar 18 16:44:07 crc kubenswrapper[4792]: I0318 16:44:07.878795 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cd98a5-46ee-419c-9cb6-def10cca9727" path="/var/lib/kubelet/pods/48cd98a5-46ee-419c-9cb6-def10cca9727/volumes" Mar 18 16:44:51 crc kubenswrapper[4792]: I0318 16:44:51.148778 4792 scope.go:117] "RemoveContainer" containerID="d85f248843d8e50213742db284e592c4601f2d22183ecb2a6110297b88693917" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.171845 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl"] Mar 18 16:45:00 crc kubenswrapper[4792]: E0318 16:45:00.173214 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.173237 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.173549 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" containerName="oc" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.174617 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.178938 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.179093 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.185699 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl"] Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.373560 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.373651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbkbg\" (UniqueName: \"kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.373768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.476468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.476639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbkbg\" (UniqueName: \"kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.477300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.478410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.495313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.498127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbkbg\" (UniqueName: \"kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg\") pod \"collect-profiles-29564205-z2pwl\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:00 crc kubenswrapper[4792]: I0318 16:45:00.515400 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:01 crc kubenswrapper[4792]: I0318 16:45:01.082141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl"] Mar 18 16:45:02 crc kubenswrapper[4792]: I0318 16:45:02.462831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" event={"ID":"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f","Type":"ContainerStarted","Data":"08abcf6792a1be6530668bffc5e2a1feebfff57dba348fbe04771480d6f4c94a"} Mar 18 16:45:02 crc kubenswrapper[4792]: I0318 16:45:02.463210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" event={"ID":"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f","Type":"ContainerStarted","Data":"57969a1b67c2341573540af6d1b94d7a4f1cf467f010e2f6b6cd3448b8b981ee"} Mar 18 16:45:02 crc kubenswrapper[4792]: I0318 16:45:02.498737 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" 
podStartSLOduration=2.498713852 podStartE2EDuration="2.498713852s" podCreationTimestamp="2026-03-18 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:45:02.48172469 +0000 UTC m=+4251.351053627" watchObservedRunningTime="2026-03-18 16:45:02.498713852 +0000 UTC m=+4251.368042789" Mar 18 16:45:03 crc kubenswrapper[4792]: I0318 16:45:03.483465 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" containerID="08abcf6792a1be6530668bffc5e2a1feebfff57dba348fbe04771480d6f4c94a" exitCode=0 Mar 18 16:45:03 crc kubenswrapper[4792]: I0318 16:45:03.483732 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" event={"ID":"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f","Type":"ContainerDied","Data":"08abcf6792a1be6530668bffc5e2a1feebfff57dba348fbe04771480d6f4c94a"} Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.101468 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.248426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume\") pod \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.248742 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume\") pod \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.248863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbkbg\" (UniqueName: \"kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg\") pod \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\" (UID: \"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f\") " Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.249799 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" (UID: "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.258285 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" (UID: "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.258295 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg" (OuterVolumeSpecName: "kube-api-access-bbkbg") pod "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" (UID: "c0bb1fb6-5f01-48a4-9b63-124c45d48d1f"). InnerVolumeSpecName "kube-api-access-bbkbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.352284 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbkbg\" (UniqueName: \"kubernetes.io/projected/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-kube-api-access-bbkbg\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.352328 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.352346 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0bb1fb6-5f01-48a4-9b63-124c45d48d1f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.528643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" event={"ID":"c0bb1fb6-5f01-48a4-9b63-124c45d48d1f","Type":"ContainerDied","Data":"57969a1b67c2341573540af6d1b94d7a4f1cf467f010e2f6b6cd3448b8b981ee"} Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.529174 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57969a1b67c2341573540af6d1b94d7a4f1cf467f010e2f6b6cd3448b8b981ee" Mar 18 16:45:05 crc kubenswrapper[4792]: I0318 16:45:05.529243 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564205-z2pwl" Mar 18 16:45:06 crc kubenswrapper[4792]: I0318 16:45:06.236414 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"] Mar 18 16:45:06 crc kubenswrapper[4792]: I0318 16:45:06.250708 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564160-4bs6m"] Mar 18 16:45:07 crc kubenswrapper[4792]: I0318 16:45:07.869391 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e39200-163d-47f3-a6bb-41fb28052c25" path="/var/lib/kubelet/pods/c5e39200-163d-47f3-a6bb-41fb28052c25/volumes" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.192823 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:27 crc kubenswrapper[4792]: E0318 16:45:27.193821 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" containerName="collect-profiles" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.193840 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" containerName="collect-profiles" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.194181 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0bb1fb6-5f01-48a4-9b63-124c45d48d1f" containerName="collect-profiles" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.197076 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.211422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.280546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.280604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6sh\" (UniqueName: \"kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.280647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.383059 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.383114 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6c6sh\" (UniqueName: \"kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.383163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.383859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.384207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.409159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6sh\" (UniqueName: \"kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh\") pod \"community-operators-hx6lv\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:27 crc kubenswrapper[4792]: I0318 16:45:27.522951 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:28 crc kubenswrapper[4792]: I0318 16:45:28.242709 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:28 crc kubenswrapper[4792]: I0318 16:45:28.830985 4792 generic.go:334] "Generic (PLEG): container finished" podID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerID="e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a" exitCode=0 Mar 18 16:45:28 crc kubenswrapper[4792]: I0318 16:45:28.831071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerDied","Data":"e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a"} Mar 18 16:45:28 crc kubenswrapper[4792]: I0318 16:45:28.831306 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerStarted","Data":"b7eaf262d03810519d7386d1040aad3f0f00b07c84201c96cd340c2de97deac9"} Mar 18 16:45:29 crc kubenswrapper[4792]: I0318 16:45:29.848779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerStarted","Data":"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20"} Mar 18 16:45:31 crc kubenswrapper[4792]: I0318 16:45:31.878516 4792 generic.go:334] "Generic (PLEG): container finished" podID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerID="c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20" exitCode=0 Mar 18 16:45:31 crc kubenswrapper[4792]: I0318 16:45:31.879103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" 
event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerDied","Data":"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20"} Mar 18 16:45:32 crc kubenswrapper[4792]: I0318 16:45:32.893474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerStarted","Data":"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc"} Mar 18 16:45:32 crc kubenswrapper[4792]: I0318 16:45:32.920186 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hx6lv" podStartSLOduration=2.254897884 podStartE2EDuration="5.92016811s" podCreationTimestamp="2026-03-18 16:45:27 +0000 UTC" firstStartedPulling="2026-03-18 16:45:28.83523748 +0000 UTC m=+4277.704566417" lastFinishedPulling="2026-03-18 16:45:32.500507696 +0000 UTC m=+4281.369836643" observedRunningTime="2026-03-18 16:45:32.912343521 +0000 UTC m=+4281.781672458" watchObservedRunningTime="2026-03-18 16:45:32.92016811 +0000 UTC m=+4281.789497047" Mar 18 16:45:37 crc kubenswrapper[4792]: I0318 16:45:37.524255 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:37 crc kubenswrapper[4792]: I0318 16:45:37.524858 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:37 crc kubenswrapper[4792]: I0318 16:45:37.581204 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:37 crc kubenswrapper[4792]: I0318 16:45:37.993930 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:38 crc kubenswrapper[4792]: I0318 16:45:38.063599 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:39 crc kubenswrapper[4792]: I0318 16:45:39.962566 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hx6lv" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="registry-server" containerID="cri-o://aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc" gracePeriod=2 Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.593798 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.783041 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities\") pod \"9131d853-3eeb-47e2-b585-a5a18adb13e6\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.783678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c6sh\" (UniqueName: \"kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh\") pod \"9131d853-3eeb-47e2-b585-a5a18adb13e6\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.783868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content\") pod \"9131d853-3eeb-47e2-b585-a5a18adb13e6\" (UID: \"9131d853-3eeb-47e2-b585-a5a18adb13e6\") " Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.784162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities" (OuterVolumeSpecName: "utilities") pod "9131d853-3eeb-47e2-b585-a5a18adb13e6" (UID: 
"9131d853-3eeb-47e2-b585-a5a18adb13e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.785003 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.793183 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh" (OuterVolumeSpecName: "kube-api-access-6c6sh") pod "9131d853-3eeb-47e2-b585-a5a18adb13e6" (UID: "9131d853-3eeb-47e2-b585-a5a18adb13e6"). InnerVolumeSpecName "kube-api-access-6c6sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.838561 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9131d853-3eeb-47e2-b585-a5a18adb13e6" (UID: "9131d853-3eeb-47e2-b585-a5a18adb13e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.887724 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c6sh\" (UniqueName: \"kubernetes.io/projected/9131d853-3eeb-47e2-b585-a5a18adb13e6-kube-api-access-6c6sh\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.887764 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9131d853-3eeb-47e2-b585-a5a18adb13e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.983472 4792 generic.go:334] "Generic (PLEG): container finished" podID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerID="aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc" exitCode=0 Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.983532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerDied","Data":"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc"} Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.983596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hx6lv" event={"ID":"9131d853-3eeb-47e2-b585-a5a18adb13e6","Type":"ContainerDied","Data":"b7eaf262d03810519d7386d1040aad3f0f00b07c84201c96cd340c2de97deac9"} Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.983617 4792 scope.go:117] "RemoveContainer" containerID="aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc" Mar 18 16:45:40 crc kubenswrapper[4792]: I0318 16:45:40.983557 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hx6lv" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.027786 4792 scope.go:117] "RemoveContainer" containerID="c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.032877 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.044916 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hx6lv"] Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.055217 4792 scope.go:117] "RemoveContainer" containerID="e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.116496 4792 scope.go:117] "RemoveContainer" containerID="aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc" Mar 18 16:45:41 crc kubenswrapper[4792]: E0318 16:45:41.116996 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc\": container with ID starting with aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc not found: ID does not exist" containerID="aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.117039 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc"} err="failed to get container status \"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc\": rpc error: code = NotFound desc = could not find container \"aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc\": container with ID starting with aebce6aa482c12bee2c671276f461a917d5fd9e043f8fdbfb56f08c5d9ba2abc not 
found: ID does not exist" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.117070 4792 scope.go:117] "RemoveContainer" containerID="c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20" Mar 18 16:45:41 crc kubenswrapper[4792]: E0318 16:45:41.118296 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20\": container with ID starting with c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20 not found: ID does not exist" containerID="c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.118326 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20"} err="failed to get container status \"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20\": rpc error: code = NotFound desc = could not find container \"c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20\": container with ID starting with c6113042181108c9aa04bb39f9dc77d5ca1fd046579e529df79fd584d6031e20 not found: ID does not exist" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.118340 4792 scope.go:117] "RemoveContainer" containerID="e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a" Mar 18 16:45:41 crc kubenswrapper[4792]: E0318 16:45:41.118656 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a\": container with ID starting with e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a not found: ID does not exist" containerID="e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.118698 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a"} err="failed to get container status \"e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a\": rpc error: code = NotFound desc = could not find container \"e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a\": container with ID starting with e6d9255db3561b65a08e0b7339f017bc294e8f98721c77b996d382ea5e9cf17a not found: ID does not exist" Mar 18 16:45:41 crc kubenswrapper[4792]: I0318 16:45:41.878601 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" path="/var/lib/kubelet/pods/9131d853-3eeb-47e2-b585-a5a18adb13e6/volumes" Mar 18 16:45:51 crc kubenswrapper[4792]: I0318 16:45:51.270045 4792 scope.go:117] "RemoveContainer" containerID="c23569782ba6f840646c7d4a6e0f628778acab1a87490df2c6af2de2027322ac" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.155626 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564206-pttsw"] Mar 18 16:46:00 crc kubenswrapper[4792]: E0318 16:46:00.157190 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="extract-content" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.157213 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="extract-content" Mar 18 16:46:00 crc kubenswrapper[4792]: E0318 16:46:00.157254 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="extract-utilities" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.157262 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="extract-utilities" Mar 18 16:46:00 crc kubenswrapper[4792]: E0318 16:46:00.157283 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="registry-server" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.157289 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="registry-server" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.157556 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9131d853-3eeb-47e2-b585-a5a18adb13e6" containerName="registry-server" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.158620 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.162136 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.162820 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.163617 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.178712 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-pttsw"] Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.277399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q68xp\" (UniqueName: \"kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp\") pod \"auto-csr-approver-29564206-pttsw\" (UID: \"8fdc705b-7914-4c02-91a2-ff2cd661ef81\") " pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.322252 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.322630 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.380705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q68xp\" (UniqueName: \"kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp\") pod \"auto-csr-approver-29564206-pttsw\" (UID: \"8fdc705b-7914-4c02-91a2-ff2cd661ef81\") " pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.415742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q68xp\" (UniqueName: \"kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp\") pod \"auto-csr-approver-29564206-pttsw\" (UID: \"8fdc705b-7914-4c02-91a2-ff2cd661ef81\") " pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:00 crc kubenswrapper[4792]: I0318 16:46:00.484688 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:01 crc kubenswrapper[4792]: I0318 16:46:01.033390 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-pttsw"] Mar 18 16:46:01 crc kubenswrapper[4792]: I0318 16:46:01.222544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-pttsw" event={"ID":"8fdc705b-7914-4c02-91a2-ff2cd661ef81","Type":"ContainerStarted","Data":"70be8750b67652ad94f65dac680c4ad4531a6eccd72d085528e68b00664ac88b"} Mar 18 16:46:03 crc kubenswrapper[4792]: I0318 16:46:03.282676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-pttsw" event={"ID":"8fdc705b-7914-4c02-91a2-ff2cd661ef81","Type":"ContainerStarted","Data":"07f2b1d1f0428562fc658f0397162e2828d3a7ed7ee32f33e207341c9be94a0b"} Mar 18 16:46:03 crc kubenswrapper[4792]: I0318 16:46:03.311213 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564206-pttsw" podStartSLOduration=2.14174685 podStartE2EDuration="3.311185876s" podCreationTimestamp="2026-03-18 16:46:00 +0000 UTC" firstStartedPulling="2026-03-18 16:46:01.037771775 +0000 UTC m=+4309.907100712" lastFinishedPulling="2026-03-18 16:46:02.207210801 +0000 UTC m=+4311.076539738" observedRunningTime="2026-03-18 16:46:03.300617178 +0000 UTC m=+4312.169946135" watchObservedRunningTime="2026-03-18 16:46:03.311185876 +0000 UTC m=+4312.180514813" Mar 18 16:46:04 crc kubenswrapper[4792]: I0318 16:46:04.298294 4792 generic.go:334] "Generic (PLEG): container finished" podID="8fdc705b-7914-4c02-91a2-ff2cd661ef81" containerID="07f2b1d1f0428562fc658f0397162e2828d3a7ed7ee32f33e207341c9be94a0b" exitCode=0 Mar 18 16:46:04 crc kubenswrapper[4792]: I0318 16:46:04.298406 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-pttsw" 
event={"ID":"8fdc705b-7914-4c02-91a2-ff2cd661ef81","Type":"ContainerDied","Data":"07f2b1d1f0428562fc658f0397162e2828d3a7ed7ee32f33e207341c9be94a0b"} Mar 18 16:46:05 crc kubenswrapper[4792]: I0318 16:46:05.778131 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:05 crc kubenswrapper[4792]: I0318 16:46:05.864279 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q68xp\" (UniqueName: \"kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp\") pod \"8fdc705b-7914-4c02-91a2-ff2cd661ef81\" (UID: \"8fdc705b-7914-4c02-91a2-ff2cd661ef81\") " Mar 18 16:46:05 crc kubenswrapper[4792]: I0318 16:46:05.873279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp" (OuterVolumeSpecName: "kube-api-access-q68xp") pod "8fdc705b-7914-4c02-91a2-ff2cd661ef81" (UID: "8fdc705b-7914-4c02-91a2-ff2cd661ef81"). InnerVolumeSpecName "kube-api-access-q68xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:46:05 crc kubenswrapper[4792]: I0318 16:46:05.969502 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q68xp\" (UniqueName: \"kubernetes.io/projected/8fdc705b-7914-4c02-91a2-ff2cd661ef81-kube-api-access-q68xp\") on node \"crc\" DevicePath \"\"" Mar 18 16:46:06 crc kubenswrapper[4792]: I0318 16:46:06.351262 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564206-pttsw" event={"ID":"8fdc705b-7914-4c02-91a2-ff2cd661ef81","Type":"ContainerDied","Data":"70be8750b67652ad94f65dac680c4ad4531a6eccd72d085528e68b00664ac88b"} Mar 18 16:46:06 crc kubenswrapper[4792]: I0318 16:46:06.351774 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70be8750b67652ad94f65dac680c4ad4531a6eccd72d085528e68b00664ac88b" Mar 18 16:46:06 crc kubenswrapper[4792]: I0318 16:46:06.351881 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564206-pttsw" Mar 18 16:46:06 crc kubenswrapper[4792]: I0318 16:46:06.396626 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-x4dh8"] Mar 18 16:46:06 crc kubenswrapper[4792]: I0318 16:46:06.408379 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564200-x4dh8"] Mar 18 16:46:07 crc kubenswrapper[4792]: I0318 16:46:07.872645 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c690c36-4098-4e0b-a5c0-c386561348a4" path="/var/lib/kubelet/pods/8c690c36-4098-4e0b-a5c0-c386561348a4/volumes" Mar 18 16:46:30 crc kubenswrapper[4792]: I0318 16:46:30.322804 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 16:46:30 crc kubenswrapper[4792]: I0318 16:46:30.323456 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:46:51 crc kubenswrapper[4792]: I0318 16:46:51.775461 4792 scope.go:117] "RemoveContainer" containerID="7d99b79b7bdbe821959b17f37905b2462586afc95c2f276016ece458302d36c2" Mar 18 16:47:00 crc kubenswrapper[4792]: I0318 16:47:00.322427 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:47:00 crc kubenswrapper[4792]: I0318 16:47:00.324449 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:47:00 crc kubenswrapper[4792]: I0318 16:47:00.324504 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:47:00 crc kubenswrapper[4792]: I0318 16:47:00.325614 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
18 16:47:00 crc kubenswrapper[4792]: I0318 16:47:00.325694 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" gracePeriod=600 Mar 18 16:47:00 crc kubenswrapper[4792]: E0318 16:47:00.473098 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:47:01 crc kubenswrapper[4792]: I0318 16:47:01.038370 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" exitCode=0 Mar 18 16:47:01 crc kubenswrapper[4792]: I0318 16:47:01.038429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470"} Mar 18 16:47:01 crc kubenswrapper[4792]: I0318 16:47:01.038473 4792 scope.go:117] "RemoveContainer" containerID="be17e60f0ea37cec338c00c4f2c42e4506f01941ef95b24c1294a99602c8023c" Mar 18 16:47:01 crc kubenswrapper[4792]: I0318 16:47:01.039387 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:47:01 crc kubenswrapper[4792]: E0318 16:47:01.039768 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:47:11 crc kubenswrapper[4792]: I0318 16:47:11.862006 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:47:11 crc kubenswrapper[4792]: E0318 16:47:11.862906 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:47:25 crc kubenswrapper[4792]: I0318 16:47:25.855275 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:47:25 crc kubenswrapper[4792]: E0318 16:47:25.856042 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:47:36 crc kubenswrapper[4792]: I0318 16:47:36.855211 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:47:36 crc kubenswrapper[4792]: E0318 16:47:36.856125 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:47:47 crc kubenswrapper[4792]: I0318 16:47:47.855048 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:47:47 crc kubenswrapper[4792]: E0318 16:47:47.856321 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.022077 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:48:00 crc kubenswrapper[4792]: E0318 16:48:00.023072 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.152794 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564208-j6b98"] Mar 18 16:48:00 crc kubenswrapper[4792]: E0318 16:48:00.154458 4792 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8fdc705b-7914-4c02-91a2-ff2cd661ef81" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.154485 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdc705b-7914-4c02-91a2-ff2cd661ef81" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.154845 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdc705b-7914-4c02-91a2-ff2cd661ef81" containerName="oc" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.156723 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.159081 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.159385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.160343 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.166360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-j6b98"] Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.231816 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2zr\" (UniqueName: \"kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr\") pod \"auto-csr-approver-29564208-j6b98\" (UID: \"11f55ae7-0638-41c2-833d-b71e63370404\") " pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.334006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2zr\" (UniqueName: 
\"kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr\") pod \"auto-csr-approver-29564208-j6b98\" (UID: \"11f55ae7-0638-41c2-833d-b71e63370404\") " pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.354293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2zr\" (UniqueName: \"kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr\") pod \"auto-csr-approver-29564208-j6b98\" (UID: \"11f55ae7-0638-41c2-833d-b71e63370404\") " pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:00 crc kubenswrapper[4792]: I0318 16:48:00.520338 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:01 crc kubenswrapper[4792]: I0318 16:48:01.001463 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:48:01 crc kubenswrapper[4792]: I0318 16:48:01.002440 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-j6b98"] Mar 18 16:48:01 crc kubenswrapper[4792]: I0318 16:48:01.711506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-j6b98" event={"ID":"11f55ae7-0638-41c2-833d-b71e63370404","Type":"ContainerStarted","Data":"ffa7ce6eaf6a4db925f807d5cba02cd45d3d291046856d4181527ccfc854486f"} Mar 18 16:48:02 crc kubenswrapper[4792]: I0318 16:48:02.723075 4792 generic.go:334] "Generic (PLEG): container finished" podID="11f55ae7-0638-41c2-833d-b71e63370404" containerID="29b35c5c9e98c2668cc597b097a617f64a4f7b5ba5ea2e3fea96efbdc844279b" exitCode=0 Mar 18 16:48:02 crc kubenswrapper[4792]: I0318 16:48:02.723229 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-j6b98" 
event={"ID":"11f55ae7-0638-41c2-833d-b71e63370404","Type":"ContainerDied","Data":"29b35c5c9e98c2668cc597b097a617f64a4f7b5ba5ea2e3fea96efbdc844279b"} Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.206535 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.279278 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn2zr\" (UniqueName: \"kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr\") pod \"11f55ae7-0638-41c2-833d-b71e63370404\" (UID: \"11f55ae7-0638-41c2-833d-b71e63370404\") " Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.287287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr" (OuterVolumeSpecName: "kube-api-access-bn2zr") pod "11f55ae7-0638-41c2-833d-b71e63370404" (UID: "11f55ae7-0638-41c2-833d-b71e63370404"). InnerVolumeSpecName "kube-api-access-bn2zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.383913 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn2zr\" (UniqueName: \"kubernetes.io/projected/11f55ae7-0638-41c2-833d-b71e63370404-kube-api-access-bn2zr\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.745814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564208-j6b98" event={"ID":"11f55ae7-0638-41c2-833d-b71e63370404","Type":"ContainerDied","Data":"ffa7ce6eaf6a4db925f807d5cba02cd45d3d291046856d4181527ccfc854486f"} Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.745867 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa7ce6eaf6a4db925f807d5cba02cd45d3d291046856d4181527ccfc854486f" Mar 18 16:48:04 crc kubenswrapper[4792]: I0318 16:48:04.745884 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564208-j6b98" Mar 18 16:48:05 crc kubenswrapper[4792]: I0318 16:48:05.281755 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-4256b"] Mar 18 16:48:05 crc kubenswrapper[4792]: I0318 16:48:05.296632 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564202-4256b"] Mar 18 16:48:05 crc kubenswrapper[4792]: I0318 16:48:05.870040 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8a5198-8500-4175-b58c-f7849d808c26" path="/var/lib/kubelet/pods/ef8a5198-8500-4175-b58c-f7849d808c26/volumes" Mar 18 16:48:10 crc kubenswrapper[4792]: I0318 16:48:10.855751 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:48:10 crc kubenswrapper[4792]: E0318 16:48:10.857170 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:24 crc kubenswrapper[4792]: I0318 16:48:24.854034 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:48:24 crc kubenswrapper[4792]: E0318 16:48:24.854873 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.729445 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:28 crc kubenswrapper[4792]: E0318 16:48:28.730261 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f55ae7-0638-41c2-833d-b71e63370404" containerName="oc" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.730276 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f55ae7-0638-41c2-833d-b71e63370404" containerName="oc" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.730519 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f55ae7-0638-41c2-833d-b71e63370404" containerName="oc" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.732719 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.753416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.811147 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.811241 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jckrc\" (UniqueName: \"kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.811378 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.914801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.915310 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.915433 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jckrc\" (UniqueName: \"kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.915537 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.915787 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:28 crc kubenswrapper[4792]: I0318 16:48:28.941733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jckrc\" (UniqueName: \"kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc\") pod \"certified-operators-n9j97\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:29 crc kubenswrapper[4792]: I0318 16:48:29.055245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:29 crc kubenswrapper[4792]: I0318 16:48:29.598591 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:30 crc kubenswrapper[4792]: I0318 16:48:30.018458 4792 generic.go:334] "Generic (PLEG): container finished" podID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerID="350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528" exitCode=0 Mar 18 16:48:30 crc kubenswrapper[4792]: I0318 16:48:30.018522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerDied","Data":"350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528"} Mar 18 16:48:30 crc kubenswrapper[4792]: I0318 16:48:30.018816 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerStarted","Data":"a73b99c2a4f9493ca9d331441ae0800a43544a772f46225af95dd345fa1bae45"} Mar 18 16:48:31 crc kubenswrapper[4792]: I0318 16:48:31.053533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerStarted","Data":"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9"} Mar 18 16:48:33 crc kubenswrapper[4792]: I0318 16:48:33.078891 4792 generic.go:334] "Generic (PLEG): container finished" podID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerID="4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9" exitCode=0 Mar 18 16:48:33 crc kubenswrapper[4792]: I0318 16:48:33.078930 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" 
event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerDied","Data":"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9"} Mar 18 16:48:35 crc kubenswrapper[4792]: I0318 16:48:35.103022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerStarted","Data":"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8"} Mar 18 16:48:35 crc kubenswrapper[4792]: I0318 16:48:35.131516 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n9j97" podStartSLOduration=3.651813171 podStartE2EDuration="7.131496133s" podCreationTimestamp="2026-03-18 16:48:28 +0000 UTC" firstStartedPulling="2026-03-18 16:48:30.020908087 +0000 UTC m=+4458.890237044" lastFinishedPulling="2026-03-18 16:48:33.500591069 +0000 UTC m=+4462.369920006" observedRunningTime="2026-03-18 16:48:35.119995006 +0000 UTC m=+4463.989323953" watchObservedRunningTime="2026-03-18 16:48:35.131496133 +0000 UTC m=+4464.000825070" Mar 18 16:48:36 crc kubenswrapper[4792]: I0318 16:48:36.854167 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:48:36 crc kubenswrapper[4792]: E0318 16:48:36.854865 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:39 crc kubenswrapper[4792]: I0318 16:48:39.056866 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:39 crc 
kubenswrapper[4792]: I0318 16:48:39.057428 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:39 crc kubenswrapper[4792]: I0318 16:48:39.108036 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:39 crc kubenswrapper[4792]: I0318 16:48:39.204946 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:39 crc kubenswrapper[4792]: I0318 16:48:39.356268 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.164742 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n9j97" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="registry-server" containerID="cri-o://7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8" gracePeriod=2 Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.694595 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.858334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content\") pod \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.859036 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jckrc\" (UniqueName: \"kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc\") pod \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.859348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities\") pod \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\" (UID: \"bb790bb0-4ee5-4d44-baac-1e40aba0d591\") " Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.860279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities" (OuterVolumeSpecName: "utilities") pod "bb790bb0-4ee5-4d44-baac-1e40aba0d591" (UID: "bb790bb0-4ee5-4d44-baac-1e40aba0d591"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.867494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc" (OuterVolumeSpecName: "kube-api-access-jckrc") pod "bb790bb0-4ee5-4d44-baac-1e40aba0d591" (UID: "bb790bb0-4ee5-4d44-baac-1e40aba0d591"). InnerVolumeSpecName "kube-api-access-jckrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.939732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb790bb0-4ee5-4d44-baac-1e40aba0d591" (UID: "bb790bb0-4ee5-4d44-baac-1e40aba0d591"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.962873 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.962928 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jckrc\" (UniqueName: \"kubernetes.io/projected/bb790bb0-4ee5-4d44-baac-1e40aba0d591-kube-api-access-jckrc\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:41 crc kubenswrapper[4792]: I0318 16:48:41.962945 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb790bb0-4ee5-4d44-baac-1e40aba0d591-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.177438 4792 generic.go:334] "Generic (PLEG): container finished" podID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerID="7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8" exitCode=0 Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.177549 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9j97" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.177563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerDied","Data":"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8"} Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.177715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9j97" event={"ID":"bb790bb0-4ee5-4d44-baac-1e40aba0d591","Type":"ContainerDied","Data":"a73b99c2a4f9493ca9d331441ae0800a43544a772f46225af95dd345fa1bae45"} Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.177744 4792 scope.go:117] "RemoveContainer" containerID="7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.199754 4792 scope.go:117] "RemoveContainer" containerID="4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.223667 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.237402 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n9j97"] Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.244056 4792 scope.go:117] "RemoveContainer" containerID="350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.285998 4792 scope.go:117] "RemoveContainer" containerID="7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8" Mar 18 16:48:42 crc kubenswrapper[4792]: E0318 16:48:42.286448 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8\": container with ID starting with 7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8 not found: ID does not exist" containerID="7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.286480 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8"} err="failed to get container status \"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8\": rpc error: code = NotFound desc = could not find container \"7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8\": container with ID starting with 7477a79064a583a5bc1d2241aa229612d546dd1e5eb43c57516382aab20be5b8 not found: ID does not exist" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.286507 4792 scope.go:117] "RemoveContainer" containerID="4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9" Mar 18 16:48:42 crc kubenswrapper[4792]: E0318 16:48:42.287222 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9\": container with ID starting with 4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9 not found: ID does not exist" containerID="4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.287254 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9"} err="failed to get container status \"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9\": rpc error: code = NotFound desc = could not find container \"4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9\": container with ID 
starting with 4cbe14f8792284cdcfcbc02b230bca626e17f2449a0cbc4e47a083ee14f4dfd9 not found: ID does not exist" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.287270 4792 scope.go:117] "RemoveContainer" containerID="350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528" Mar 18 16:48:42 crc kubenswrapper[4792]: E0318 16:48:42.287584 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528\": container with ID starting with 350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528 not found: ID does not exist" containerID="350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528" Mar 18 16:48:42 crc kubenswrapper[4792]: I0318 16:48:42.287605 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528"} err="failed to get container status \"350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528\": rpc error: code = NotFound desc = could not find container \"350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528\": container with ID starting with 350585906a62ba5b8ba2c8012f47fd473f05e8cba1a98547d4c2f67912a99528 not found: ID does not exist" Mar 18 16:48:43 crc kubenswrapper[4792]: I0318 16:48:43.868427 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" path="/var/lib/kubelet/pods/bb790bb0-4ee5-4d44-baac-1e40aba0d591/volumes" Mar 18 16:48:49 crc kubenswrapper[4792]: I0318 16:48:49.855421 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:48:49 crc kubenswrapper[4792]: E0318 16:48:49.856274 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:48:51 crc kubenswrapper[4792]: I0318 16:48:51.888065 4792 scope.go:117] "RemoveContainer" containerID="99458f12a7ba6e80f728170a7738a7a7b5a41d54d3d46204852457a696b1ddbe" Mar 18 16:49:02 crc kubenswrapper[4792]: I0318 16:49:02.854212 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:49:02 crc kubenswrapper[4792]: E0318 16:49:02.855071 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:49:16 crc kubenswrapper[4792]: I0318 16:49:16.855440 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:49:16 crc kubenswrapper[4792]: E0318 16:49:16.856540 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:49:30 crc kubenswrapper[4792]: I0318 16:49:30.855383 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:49:30 crc 
kubenswrapper[4792]: E0318 16:49:30.856352 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:49:41 crc kubenswrapper[4792]: I0318 16:49:41.868915 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:49:41 crc kubenswrapper[4792]: E0318 16:49:41.869766 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:49:54 crc kubenswrapper[4792]: I0318 16:49:54.854851 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:49:54 crc kubenswrapper[4792]: E0318 16:49:54.855693 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.144609 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564210-xg7wz"] Mar 18 
16:50:00 crc kubenswrapper[4792]: E0318 16:50:00.145698 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="registry-server" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.145713 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="registry-server" Mar 18 16:50:00 crc kubenswrapper[4792]: E0318 16:50:00.145740 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="extract-content" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.145748 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="extract-content" Mar 18 16:50:00 crc kubenswrapper[4792]: E0318 16:50:00.145756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="extract-utilities" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.145762 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="extract-utilities" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.146033 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb790bb0-4ee5-4d44-baac-1e40aba0d591" containerName="registry-server" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.146929 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.150191 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.151141 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.151397 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.162131 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-xg7wz"] Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.320516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x28qj\" (UniqueName: \"kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj\") pod \"auto-csr-approver-29564210-xg7wz\" (UID: \"1aad12d5-33fc-4adf-8c9d-d603876c4557\") " pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.423378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x28qj\" (UniqueName: \"kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj\") pod \"auto-csr-approver-29564210-xg7wz\" (UID: \"1aad12d5-33fc-4adf-8c9d-d603876c4557\") " pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.454703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x28qj\" (UniqueName: \"kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj\") pod \"auto-csr-approver-29564210-xg7wz\" (UID: \"1aad12d5-33fc-4adf-8c9d-d603876c4557\") " 
pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:00 crc kubenswrapper[4792]: I0318 16:50:00.472859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:01 crc kubenswrapper[4792]: W0318 16:50:01.026625 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aad12d5_33fc_4adf_8c9d_d603876c4557.slice/crio-428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1 WatchSource:0}: Error finding container 428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1: Status 404 returned error can't find the container with id 428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1 Mar 18 16:50:01 crc kubenswrapper[4792]: I0318 16:50:01.027193 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-xg7wz"] Mar 18 16:50:01 crc kubenswrapper[4792]: I0318 16:50:01.093330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" event={"ID":"1aad12d5-33fc-4adf-8c9d-d603876c4557","Type":"ContainerStarted","Data":"428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1"} Mar 18 16:50:04 crc kubenswrapper[4792]: I0318 16:50:04.147543 4792 generic.go:334] "Generic (PLEG): container finished" podID="1aad12d5-33fc-4adf-8c9d-d603876c4557" containerID="2864eda9fc36dd02bbc834e8711adc58c0d759dca48c487361d34d75e6fd4708" exitCode=0 Mar 18 16:50:04 crc kubenswrapper[4792]: I0318 16:50:04.147626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" event={"ID":"1aad12d5-33fc-4adf-8c9d-d603876c4557","Type":"ContainerDied","Data":"2864eda9fc36dd02bbc834e8711adc58c0d759dca48c487361d34d75e6fd4708"} Mar 18 16:50:05 crc kubenswrapper[4792]: I0318 16:50:05.691217 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:05 crc kubenswrapper[4792]: I0318 16:50:05.858714 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x28qj\" (UniqueName: \"kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj\") pod \"1aad12d5-33fc-4adf-8c9d-d603876c4557\" (UID: \"1aad12d5-33fc-4adf-8c9d-d603876c4557\") " Mar 18 16:50:05 crc kubenswrapper[4792]: I0318 16:50:05.865558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj" (OuterVolumeSpecName: "kube-api-access-x28qj") pod "1aad12d5-33fc-4adf-8c9d-d603876c4557" (UID: "1aad12d5-33fc-4adf-8c9d-d603876c4557"). InnerVolumeSpecName "kube-api-access-x28qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:50:05 crc kubenswrapper[4792]: I0318 16:50:05.962679 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x28qj\" (UniqueName: \"kubernetes.io/projected/1aad12d5-33fc-4adf-8c9d-d603876c4557-kube-api-access-x28qj\") on node \"crc\" DevicePath \"\"" Mar 18 16:50:06 crc kubenswrapper[4792]: I0318 16:50:06.173568 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" event={"ID":"1aad12d5-33fc-4adf-8c9d-d603876c4557","Type":"ContainerDied","Data":"428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1"} Mar 18 16:50:06 crc kubenswrapper[4792]: I0318 16:50:06.173962 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="428529302a32f1d9e1dcfe28a1f258011f7082176c71c2d5bd728c07f50f85f1" Mar 18 16:50:06 crc kubenswrapper[4792]: I0318 16:50:06.173632 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564210-xg7wz" Mar 18 16:50:06 crc kubenswrapper[4792]: I0318 16:50:06.755722 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-wv5dx"] Mar 18 16:50:06 crc kubenswrapper[4792]: I0318 16:50:06.768252 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564204-wv5dx"] Mar 18 16:50:07 crc kubenswrapper[4792]: I0318 16:50:07.869723 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1" path="/var/lib/kubelet/pods/bb6c6a9b-8c7f-4e6a-ab6a-7b6036f189c1/volumes" Mar 18 16:50:09 crc kubenswrapper[4792]: I0318 16:50:09.854964 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:50:09 crc kubenswrapper[4792]: E0318 16:50:09.855858 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:50:24 crc kubenswrapper[4792]: I0318 16:50:24.859267 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:50:24 crc kubenswrapper[4792]: E0318 16:50:24.860736 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:50:36 crc kubenswrapper[4792]: I0318 16:50:36.854439 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:50:36 crc kubenswrapper[4792]: E0318 16:50:36.855265 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.868363 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:50:47 crc kubenswrapper[4792]: E0318 16:50:47.869476 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aad12d5-33fc-4adf-8c9d-d603876c4557" containerName="oc" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.869494 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aad12d5-33fc-4adf-8c9d-d603876c4557" containerName="oc" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.869760 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aad12d5-33fc-4adf-8c9d-d603876c4557" containerName="oc" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.870827 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.875603 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.875689 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.876134 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.876477 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ns7mj" Mar 18 16:50:47 crc kubenswrapper[4792]: I0318 16:50:47.877578 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062036 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062584 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.062840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.063055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.063253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.063316 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7z2\" (UniqueName: \"kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.169819 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.170270 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.170361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.170411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7z2\" (UniqueName: \"kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2\") pod 
\"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.171198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.171250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.171299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.171389 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.171459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " 
pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.172361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.173953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.174805 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.176036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.176253 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.177018 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.177205 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.178080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.207247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7z2\" (UniqueName: \"kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.272204 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") " pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.504689 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 16:50:48 crc kubenswrapper[4792]: I0318 16:50:48.967924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 16:50:49 crc kubenswrapper[4792]: I0318 16:50:49.655709 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"11070522-2520-4564-a02c-3bd460ae33fe","Type":"ContainerStarted","Data":"7c85f5866742680724d018262bbaf326c49125ccd67c4019f1f26979b44ea100"} Mar 18 16:50:50 crc kubenswrapper[4792]: I0318 16:50:50.855225 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:50:50 crc kubenswrapper[4792]: E0318 16:50:50.856023 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:50:52 crc kubenswrapper[4792]: I0318 16:50:52.021336 4792 scope.go:117] "RemoveContainer" containerID="648546e75c13dbabb334fb8764d6e17965b3291bba0ea35d3d004e6fe46c3f69" Mar 18 16:51:05 crc kubenswrapper[4792]: I0318 16:51:05.855946 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:51:05 crc kubenswrapper[4792]: E0318 16:51:05.857237 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:51:17 crc kubenswrapper[4792]: I0318 16:51:17.857104 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:51:17 crc kubenswrapper[4792]: E0318 16:51:17.858047 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:51:24 crc kubenswrapper[4792]: E0318 16:51:24.255281 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 16:51:24 crc kubenswrapper[4792]: E0318 16:51:24.258258 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qq7z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(11070522-2520-4564-a02c-3bd460ae33fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 16:51:24 crc kubenswrapper[4792]: E0318 16:51:24.259462 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="11070522-2520-4564-a02c-3bd460ae33fe" Mar 18 16:51:25 crc kubenswrapper[4792]: E0318 16:51:25.071836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="11070522-2520-4564-a02c-3bd460ae33fe" Mar 18 16:51:29 crc 
kubenswrapper[4792]: I0318 16:51:29.855530 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:51:29 crc kubenswrapper[4792]: E0318 16:51:29.856733 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:51:38 crc kubenswrapper[4792]: I0318 16:51:38.267802 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 16:51:40 crc kubenswrapper[4792]: I0318 16:51:40.237033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"11070522-2520-4564-a02c-3bd460ae33fe","Type":"ContainerStarted","Data":"54e2c32417ddc412685148278d33c076ee0cb52cf58f22f52650de7e4e9a6cc8"} Mar 18 16:51:40 crc kubenswrapper[4792]: I0318 16:51:40.262794 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.972210459 podStartE2EDuration="54.262767536s" podCreationTimestamp="2026-03-18 16:50:46 +0000 UTC" firstStartedPulling="2026-03-18 16:50:48.974738626 +0000 UTC m=+4597.844067563" lastFinishedPulling="2026-03-18 16:51:38.265295703 +0000 UTC m=+4647.134624640" observedRunningTime="2026-03-18 16:51:40.255855896 +0000 UTC m=+4649.125184863" watchObservedRunningTime="2026-03-18 16:51:40.262767536 +0000 UTC m=+4649.132096473" Mar 18 16:51:40 crc kubenswrapper[4792]: I0318 16:51:40.855249 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:51:40 crc kubenswrapper[4792]: E0318 
16:51:40.856211 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:51:51 crc kubenswrapper[4792]: I0318 16:51:51.863386 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:51:51 crc kubenswrapper[4792]: E0318 16:51:51.864252 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.150254 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564212-c99b6"] Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.155772 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.159788 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.160564 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.162106 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.174555 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-c99b6"] Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.308084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxpn\" (UniqueName: \"kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn\") pod \"auto-csr-approver-29564212-c99b6\" (UID: \"64b4719d-5356-4698-bc46-11e4df4fe32a\") " pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.410479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxpn\" (UniqueName: \"kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn\") pod \"auto-csr-approver-29564212-c99b6\" (UID: \"64b4719d-5356-4698-bc46-11e4df4fe32a\") " pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.432729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxpn\" (UniqueName: \"kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn\") pod \"auto-csr-approver-29564212-c99b6\" (UID: \"64b4719d-5356-4698-bc46-11e4df4fe32a\") " 
pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:00 crc kubenswrapper[4792]: I0318 16:52:00.488419 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:01 crc kubenswrapper[4792]: I0318 16:52:01.146343 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-c99b6"] Mar 18 16:52:01 crc kubenswrapper[4792]: I0318 16:52:01.526305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-c99b6" event={"ID":"64b4719d-5356-4698-bc46-11e4df4fe32a","Type":"ContainerStarted","Data":"cc751b1311058a912b7315ea59f18eb62bac4341c979fe62bc4f19b1fc7df4ba"} Mar 18 16:52:03 crc kubenswrapper[4792]: I0318 16:52:03.550962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-c99b6" event={"ID":"64b4719d-5356-4698-bc46-11e4df4fe32a","Type":"ContainerStarted","Data":"64de97ad6257cea44b2f346c80ffd4de370764dadccc41936ad070b27ce98dda"} Mar 18 16:52:03 crc kubenswrapper[4792]: I0318 16:52:03.572284 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564212-c99b6" podStartSLOduration=1.9790352310000001 podStartE2EDuration="3.572262374s" podCreationTimestamp="2026-03-18 16:52:00 +0000 UTC" firstStartedPulling="2026-03-18 16:52:01.192450265 +0000 UTC m=+4670.061779202" lastFinishedPulling="2026-03-18 16:52:02.785677408 +0000 UTC m=+4671.655006345" observedRunningTime="2026-03-18 16:52:03.565986314 +0000 UTC m=+4672.435315251" watchObservedRunningTime="2026-03-18 16:52:03.572262374 +0000 UTC m=+4672.441591311" Mar 18 16:52:04 crc kubenswrapper[4792]: I0318 16:52:04.562191 4792 generic.go:334] "Generic (PLEG): container finished" podID="64b4719d-5356-4698-bc46-11e4df4fe32a" containerID="64de97ad6257cea44b2f346c80ffd4de370764dadccc41936ad070b27ce98dda" exitCode=0 Mar 18 16:52:04 crc 
kubenswrapper[4792]: I0318 16:52:04.562302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-c99b6" event={"ID":"64b4719d-5356-4698-bc46-11e4df4fe32a","Type":"ContainerDied","Data":"64de97ad6257cea44b2f346c80ffd4de370764dadccc41936ad070b27ce98dda"} Mar 18 16:52:05 crc kubenswrapper[4792]: I0318 16:52:05.855487 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.053288 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.180577 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxpn\" (UniqueName: \"kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn\") pod \"64b4719d-5356-4698-bc46-11e4df4fe32a\" (UID: \"64b4719d-5356-4698-bc46-11e4df4fe32a\") " Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.189262 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn" (OuterVolumeSpecName: "kube-api-access-5wxpn") pod "64b4719d-5356-4698-bc46-11e4df4fe32a" (UID: "64b4719d-5356-4698-bc46-11e4df4fe32a"). InnerVolumeSpecName "kube-api-access-5wxpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.284653 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxpn\" (UniqueName: \"kubernetes.io/projected/64b4719d-5356-4698-bc46-11e4df4fe32a-kube-api-access-5wxpn\") on node \"crc\" DevicePath \"\"" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.588713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466"} Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.592071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564212-c99b6" event={"ID":"64b4719d-5356-4698-bc46-11e4df4fe32a","Type":"ContainerDied","Data":"cc751b1311058a912b7315ea59f18eb62bac4341c979fe62bc4f19b1fc7df4ba"} Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.592105 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc751b1311058a912b7315ea59f18eb62bac4341c979fe62bc4f19b1fc7df4ba" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.592145 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564212-c99b6" Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.668717 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-pttsw"] Mar 18 16:52:06 crc kubenswrapper[4792]: I0318 16:52:06.687756 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564206-pttsw"] Mar 18 16:52:07 crc kubenswrapper[4792]: I0318 16:52:07.870072 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdc705b-7914-4c02-91a2-ff2cd661ef81" path="/var/lib/kubelet/pods/8fdc705b-7914-4c02-91a2-ff2cd661ef81/volumes" Mar 18 16:52:56 crc kubenswrapper[4792]: I0318 16:52:56.569262 4792 scope.go:117] "RemoveContainer" containerID="07f2b1d1f0428562fc658f0397162e2828d3a7ed7ee32f33e207341c9be94a0b" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.738243 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:53:44 crc kubenswrapper[4792]: E0318 16:53:44.766434 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b4719d-5356-4698-bc46-11e4df4fe32a" containerName="oc" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.766512 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b4719d-5356-4698-bc46-11e4df4fe32a" containerName="oc" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.768528 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b4719d-5356-4698-bc46-11e4df4fe32a" containerName="oc" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.785577 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.886257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.886782 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.886903 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5pq\" (UniqueName: \"kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.936114 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.991017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.991296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.991449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5pq\" (UniqueName: \"kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.994543 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:44 crc kubenswrapper[4792]: I0318 16:53:44.995589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:45 crc kubenswrapper[4792]: I0318 16:53:45.036026 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5pq\" (UniqueName: \"kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq\") pod \"redhat-operators-7g6m5\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:45 crc kubenswrapper[4792]: I0318 16:53:45.123576 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:53:46 crc kubenswrapper[4792]: I0318 16:53:46.613162 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:53:46 crc kubenswrapper[4792]: I0318 16:53:46.805582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerStarted","Data":"52468e3a7aba70fb0f97497d636a87f28801aea629a1e346b65a468c096fcbb7"} Mar 18 16:53:47 crc kubenswrapper[4792]: I0318 16:53:47.822960 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerDied","Data":"69138f83fa99064086ef1724815f82fe445e5654c11da06d5a42704f396e8363"} Mar 18 16:53:47 crc kubenswrapper[4792]: I0318 16:53:47.825235 4792 generic.go:334] "Generic (PLEG): container finished" podID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerID="69138f83fa99064086ef1724815f82fe445e5654c11da06d5a42704f396e8363" exitCode=0 Mar 18 16:53:47 crc kubenswrapper[4792]: I0318 16:53:47.836896 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:53:49 crc kubenswrapper[4792]: I0318 16:53:49.893736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerStarted","Data":"6e6a7e0b00e09b384ab6e0aca0e568fcb2a04029b64e2eb5769206d8684d65a4"} Mar 18 16:53:56 crc kubenswrapper[4792]: I0318 16:53:56.970445 4792 generic.go:334] "Generic (PLEG): container finished" podID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerID="6e6a7e0b00e09b384ab6e0aca0e568fcb2a04029b64e2eb5769206d8684d65a4" exitCode=0 Mar 18 16:53:56 crc kubenswrapper[4792]: I0318 16:53:56.970545 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerDied","Data":"6e6a7e0b00e09b384ab6e0aca0e568fcb2a04029b64e2eb5769206d8684d65a4"} Mar 18 16:53:57 crc kubenswrapper[4792]: I0318 16:53:57.983572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerStarted","Data":"26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65"} Mar 18 16:53:58 crc kubenswrapper[4792]: I0318 16:53:58.008917 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7g6m5" podStartSLOduration=4.434896087 podStartE2EDuration="14.006518605s" podCreationTimestamp="2026-03-18 16:53:44 +0000 UTC" firstStartedPulling="2026-03-18 16:53:47.830269601 +0000 UTC m=+4776.699598548" lastFinishedPulling="2026-03-18 16:53:57.401892129 +0000 UTC m=+4786.271221066" observedRunningTime="2026-03-18 16:53:58.005543004 +0000 UTC m=+4786.874871961" watchObservedRunningTime="2026-03-18 16:53:58.006518605 +0000 UTC m=+4786.875847542" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.695081 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.722115 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564214-nbg97"] Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.724453 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.724470 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.728724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m58l\" (UniqueName: \"kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.729205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.729423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.752634 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.756531 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.758163 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.830740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-czj4f\" (UniqueName: \"kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f\") pod \"auto-csr-approver-29564214-nbg97\" (UID: \"61a2888c-51a4-4f58-8bab-671f62c3c29f\") " pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.830826 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.831208 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.831486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m58l\" (UniqueName: \"kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.838336 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.838867 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.859932 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-nbg97"] Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.888863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.914757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m58l\" (UniqueName: \"kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l\") pod \"redhat-marketplace-rfpd8\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.933126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czj4f\" (UniqueName: \"kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f\") pod \"auto-csr-approver-29564214-nbg97\" (UID: \"61a2888c-51a4-4f58-8bab-671f62c3c29f\") " pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:00 crc kubenswrapper[4792]: I0318 16:54:00.974528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czj4f\" (UniqueName: \"kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f\") pod \"auto-csr-approver-29564214-nbg97\" (UID: \"61a2888c-51a4-4f58-8bab-671f62c3c29f\") " pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:01 crc kubenswrapper[4792]: I0318 16:54:01.082786 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:01 crc kubenswrapper[4792]: I0318 16:54:01.093839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:02 crc kubenswrapper[4792]: I0318 16:54:02.744964 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-nbg97"] Mar 18 16:54:02 crc kubenswrapper[4792]: I0318 16:54:02.818606 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:54:03 crc kubenswrapper[4792]: I0318 16:54:03.059818 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-nbg97" event={"ID":"61a2888c-51a4-4f58-8bab-671f62c3c29f","Type":"ContainerStarted","Data":"e9160f2d264a53ba6ff9d793c131b87d3bc91b6b7c696471dda8c09221c9be1e"} Mar 18 16:54:03 crc kubenswrapper[4792]: I0318 16:54:03.061571 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerStarted","Data":"95213080f912d3d4296d72b351f520df18698f20e038fca8e07c363fa0e3acef"} Mar 18 16:54:04 crc kubenswrapper[4792]: I0318 16:54:04.080172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerDied","Data":"341f3f72cff203564e73b8e2b2eda6861717198e718dac20e09ac134c558ce44"} Mar 18 16:54:04 crc kubenswrapper[4792]: I0318 16:54:04.080338 4792 generic.go:334] "Generic (PLEG): container finished" podID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerID="341f3f72cff203564e73b8e2b2eda6861717198e718dac20e09ac134c558ce44" exitCode=0 Mar 18 16:54:05 crc kubenswrapper[4792]: I0318 16:54:05.104117 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerStarted","Data":"e77ce8d1ede1f77ace1e704492c381d67e4e2aeec9c8557ea2301dbb6b0cd211"} Mar 18 16:54:05 crc kubenswrapper[4792]: I0318 16:54:05.126368 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:54:05 crc kubenswrapper[4792]: I0318 16:54:05.126424 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:54:06 crc kubenswrapper[4792]: I0318 16:54:06.115912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-nbg97" event={"ID":"61a2888c-51a4-4f58-8bab-671f62c3c29f","Type":"ContainerStarted","Data":"7f1ecc138b3c6b07033d7f5569f04e187ddebd4069346601b77f8a9aee767dd2"} Mar 18 16:54:06 crc kubenswrapper[4792]: I0318 16:54:06.185830 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564214-nbg97" podStartSLOduration=4.919003198 podStartE2EDuration="6.178795241s" podCreationTimestamp="2026-03-18 16:54:00 +0000 UTC" firstStartedPulling="2026-03-18 16:54:02.875960615 +0000 UTC m=+4791.745289562" lastFinishedPulling="2026-03-18 16:54:04.135752668 +0000 UTC m=+4793.005081605" observedRunningTime="2026-03-18 16:54:06.167703687 +0000 UTC m=+4795.037032624" watchObservedRunningTime="2026-03-18 16:54:06.178795241 +0000 UTC m=+4795.048124178" Mar 18 16:54:06 crc kubenswrapper[4792]: I0318 16:54:06.318678 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:06 crc kubenswrapper[4792]: > Mar 18 16:54:09 crc kubenswrapper[4792]: 
I0318 16:54:09.160333 4792 generic.go:334] "Generic (PLEG): container finished" podID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerID="e77ce8d1ede1f77ace1e704492c381d67e4e2aeec9c8557ea2301dbb6b0cd211" exitCode=0 Mar 18 16:54:09 crc kubenswrapper[4792]: I0318 16:54:09.160817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerDied","Data":"e77ce8d1ede1f77ace1e704492c381d67e4e2aeec9c8557ea2301dbb6b0cd211"} Mar 18 16:54:11 crc kubenswrapper[4792]: I0318 16:54:11.187390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerStarted","Data":"edecd53cc19d803b55df92be90939338bf32b6ca4386a1916dd0cba529bb8f95"} Mar 18 16:54:11 crc kubenswrapper[4792]: I0318 16:54:11.190425 4792 generic.go:334] "Generic (PLEG): container finished" podID="61a2888c-51a4-4f58-8bab-671f62c3c29f" containerID="7f1ecc138b3c6b07033d7f5569f04e187ddebd4069346601b77f8a9aee767dd2" exitCode=0 Mar 18 16:54:11 crc kubenswrapper[4792]: I0318 16:54:11.190488 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-nbg97" event={"ID":"61a2888c-51a4-4f58-8bab-671f62c3c29f","Type":"ContainerDied","Data":"7f1ecc138b3c6b07033d7f5569f04e187ddebd4069346601b77f8a9aee767dd2"} Mar 18 16:54:11 crc kubenswrapper[4792]: I0318 16:54:11.222675 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rfpd8" podStartSLOduration=5.502625714 podStartE2EDuration="11.222652632s" podCreationTimestamp="2026-03-18 16:54:00 +0000 UTC" firstStartedPulling="2026-03-18 16:54:04.090161244 +0000 UTC m=+4792.959490181" lastFinishedPulling="2026-03-18 16:54:09.810188152 +0000 UTC m=+4798.679517099" observedRunningTime="2026-03-18 16:54:11.214473212 +0000 UTC m=+4800.083802179" 
watchObservedRunningTime="2026-03-18 16:54:11.222652632 +0000 UTC m=+4800.091981569" Mar 18 16:54:14 crc kubenswrapper[4792]: I0318 16:54:14.931323 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:14 crc kubenswrapper[4792]: I0318 16:54:14.936267 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:14 crc kubenswrapper[4792]: I0318 16:54:14.955491 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:14 crc kubenswrapper[4792]: I0318 16:54:14.955560 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:15 crc kubenswrapper[4792]: I0318 16:54:15.180278 4792 trace.go:236] Trace[1589156591]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-67xp4" (18-Mar-2026 16:54:13.618) (total time: 1546ms): Mar 18 16:54:15 crc kubenswrapper[4792]: Trace[1589156591]: [1.546639138s] [1.546639138s] END Mar 18 16:54:15 crc kubenswrapper[4792]: I0318 
16:54:15.759190 4792 patch_prober.go:28] interesting pod/metrics-server-7db98db598-wxffp container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:15 crc kubenswrapper[4792]: I0318 16:54:15.759213 4792 patch_prober.go:28] interesting pod/metrics-server-7db98db598-wxffp container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:15 crc kubenswrapper[4792]: I0318 16:54:15.759558 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" podUID="de45de1b-91b3-41ff-9f73-95048b051745" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:15 crc kubenswrapper[4792]: I0318 16:54:15.759747 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" podUID="de45de1b-91b3-41ff-9f73-95048b051745" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:16 crc kubenswrapper[4792]: I0318 16:54:16.778643 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:16 crc kubenswrapper[4792]: I0318 16:54:16.943113 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czj4f\" (UniqueName: \"kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f\") pod \"61a2888c-51a4-4f58-8bab-671f62c3c29f\" (UID: \"61a2888c-51a4-4f58-8bab-671f62c3c29f\") " Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.025068 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f" (OuterVolumeSpecName: "kube-api-access-czj4f") pod "61a2888c-51a4-4f58-8bab-671f62c3c29f" (UID: "61a2888c-51a4-4f58-8bab-671f62c3c29f"). InnerVolumeSpecName "kube-api-access-czj4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.049981 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czj4f\" (UniqueName: \"kubernetes.io/projected/61a2888c-51a4-4f58-8bab-671f62c3c29f-kube-api-access-czj4f\") on node \"crc\" DevicePath \"\"" Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.270207 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564214-nbg97" Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.274717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564214-nbg97" event={"ID":"61a2888c-51a4-4f58-8bab-671f62c3c29f","Type":"ContainerDied","Data":"e9160f2d264a53ba6ff9d793c131b87d3bc91b6b7c696471dda8c09221c9be1e"} Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.274808 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9160f2d264a53ba6ff9d793c131b87d3bc91b6b7c696471dda8c09221c9be1e" Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.358380 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:17 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:17 crc kubenswrapper[4792]: > Mar 18 16:54:17 crc kubenswrapper[4792]: I0318 16:54:17.836246 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.490251 4792 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-pk6tp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.490642 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" 
podUID="57f5df54-714a-4cce-970a-7069ffd1cb63" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.656176 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.656251 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.656318 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.656381 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 
16:54:18.931095 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.931732 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.932167 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.932222 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.932287 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:18 crc kubenswrapper[4792]: I0318 16:54:18.932306 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.925817 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.926289 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.925863 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.926716 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.959686 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 
16:54:19.959755 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.959781 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:19 crc kubenswrapper[4792]: I0318 16:54:19.959842 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.162941 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.163296 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.834249 4792 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.834372 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.931831 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:20 crc kubenswrapper[4792]: I0318 16:54:20.934207 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.084734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.084829 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.874745 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get 
\"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.874778 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.874840 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.874888 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.962270 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.962641 4792 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.962281 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:21 crc kubenswrapper[4792]: I0318 16:54:21.962904 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.177402 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-j6b98"] Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.186920 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564208-j6b98"] Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.847104 4792 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.847171 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959262 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959346 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959607 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959644 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:22 crc 
kubenswrapper[4792]: I0318 16:54:22.959689 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959708 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959744 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:22 crc kubenswrapper[4792]: I0318 16:54:22.959760 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.707602 4792 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-x5w94 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.707925 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.707813 4792 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-x5w94 container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.708043 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.794115 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rfpd8" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:23 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:23 crc kubenswrapper[4792]: > Mar 18 16:54:23 crc kubenswrapper[4792]: I0318 16:54:23.880719 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f55ae7-0638-41c2-833d-b71e63370404" path="/var/lib/kubelet/pods/11f55ae7-0638-41c2-833d-b71e63370404/volumes" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.101897 4792 patch_prober.go:28] 
interesting pod/perses-operator-d9577b4dd-zfrmv container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.102005 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.102432 4792 patch_prober.go:28] interesting pod/perses-operator-d9577b4dd-zfrmv container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.102508 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.184246 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" podUID="aa2e6c5a-c94a-482a-aceb-156b1cc316d0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.184419 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-jtksk" podUID="aa2e6c5a-c94a-482a-aceb-156b1cc316d0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.268245 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.268430 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.435958 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" podUID="d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.436031 4792 patch_prober.go:28] interesting pod/console-855f5dc7f-qnkcz container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.436097 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-console/console-855f5dc7f-qnkcz" podUID="0e8a660f-2f46-41b3-badb-2d1164cea860" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.436116 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" podUID="79896742-17fd-4960-ae5b-af3c83550a4e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.435967 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" podUID="d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.436135 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-rpk5q" podUID="79896742-17fd-4960-ae5b-af3c83550a4e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.478480 4792 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-dfcx2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 
crc kubenswrapper[4792]: I0318 16:54:24.478554 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" podUID="1c367fec-09d4-46fa-8900-0c508ced5de9" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.584190 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" podUID="dd73a890-f234-415f-b99a-685059be7d48" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.584435 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-kmpvr" podUID="dd73a890-f234-415f-b99a-685059be7d48" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.680922 4792 trace.go:236] Trace[2064747479]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (18-Mar-2026 16:54:23.257) (total time: 1412ms): Mar 18 16:54:24 crc kubenswrapper[4792]: Trace[2064747479]: [1.412136399s] [1.412136399s] END Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.680934 4792 trace.go:236] Trace[838177801]: "Calculate volume metrics of wal for pod openshift-logging/logging-loki-ingester-0" (18-Mar-2026 16:54:23.265) (total time: 1402ms): Mar 18 16:54:24 crc kubenswrapper[4792]: Trace[838177801]: [1.402450221s] [1.402450221s] END Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.680956 4792 trace.go:236] 
Trace[1667559595]: "Calculate volume metrics of glance for pod openstack/glance-default-internal-api-0" (18-Mar-2026 16:54:21.872) (total time: 2800ms): Mar 18 16:54:24 crc kubenswrapper[4792]: Trace[1667559595]: [2.800546462s] [2.800546462s] END Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.712272 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" podUID="abc215c2-57eb-4c7a-b19d-0ed3ccd67001" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.712318 4792 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-86w5q container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.712347 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" podUID="f9dbb2aa-f06a-431d-b181-29315e9170cb" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.712282 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podUID="a518542e-e1c4-4754-9031-d3f1571abb27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.712651 4792 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podUID="a518542e-e1c4-4754-9031-d3f1571abb27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.767274 4792 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-gqm44 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.767356 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" podUID="8927cd79-8eff-4f53-a676-782cbb366e9c" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.887942 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" podUID="a1327184-da65-478d-b7a7-15d0daa3ca95" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.888356 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-l27l2" podUID="a1327184-da65-478d-b7a7-15d0daa3ca95" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.926180 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.926237 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.926285 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.926328 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.955156 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.955223 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.955547 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:24 crc kubenswrapper[4792]: I0318 16:54:24.955594 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.076847 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podUID="5e4dd350-9a5b-4626-8b3d-6b9c097b4be1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.160359 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podUID="5e4dd350-9a5b-4626-8b3d-6b9c097b4be1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.160385 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podUID="05dba0ab-e659-4e0c-8713-4eebeca6edba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.160866 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podUID="05dba0ab-e659-4e0c-8713-4eebeca6edba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.585242 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" podUID="7d4badb4-1388-47c8-aed9-f8478388af41" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.43:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.644836 4792 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:25 crc kubenswrapper[4792]: I0318 16:54:25.644920 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c6c59cfd-2add-4b4e-81c1-bacc77deae06" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 18 16:54:26 crc kubenswrapper[4792]: I0318 16:54:26.136170 4792 patch_prober.go:28] interesting pod/monitoring-plugin-69764bd9c7-ntlfk container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:26 crc kubenswrapper[4792]: I0318 16:54:26.136405 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" podUID="7bb8e68c-2183-4f5a-88a3-8c274d017247" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:26 crc kubenswrapper[4792]: I0318 16:54:26.208891 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:26 crc kubenswrapper[4792]: I0318 16:54:26.209024 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.218178 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" podUID="8bdbd945-a92a-471b-8a37-c999fe503caa" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7472/metrics\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.218781 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" podUID="8bdbd945-a92a-471b-8a37-c999fe503caa" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.918171 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.918237 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.918632 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:27 crc kubenswrapper[4792]: I0318 16:54:27.936500 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.000244 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/controller-7bb4cc7c98-9k68t" podUID="260602fc-bedf-40ec-92e7-a96e3ee009f0" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.000493 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-9k68t" podUID="260602fc-bedf-40ec-92e7-a96e3ee009f0" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.416806 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podUID="423d82c6-fd0b-4cb5-8ff2-501f479a9a73" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.458232 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podUID="423d82c6-fd0b-4cb5-8ff2-501f479a9a73" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.458305 4792 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-pk6tp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.458414 4792 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" podUID="57f5df54-714a-4cce-970a-7069ffd1cb63" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.657478 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.657571 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.657574 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.657624 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.670924 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-gkmkk" podUID="cc92711e-be7a-4025-9077-cac9e5bc7df8" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:28 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:28 crc kubenswrapper[4792]: > Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.671056 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-gkmkk" podUID="cc92711e-be7a-4025-9077-cac9e5bc7df8" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:28 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:28 crc kubenswrapper[4792]: > Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.863491 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Liveness probe status=failure output="" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.863502 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Readiness probe status=failure output="" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.892363 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.892382 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: 
Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.892434 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.892445 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.931302 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.931315 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.974233 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.974334 4792 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.974832 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.974924 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.991510 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmfwt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.991583 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podUID="f75c6f2b-0965-4e4d-9445-dc65d69c970b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.991609 
4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmfwt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:28 crc kubenswrapper[4792]: I0318 16:54:28.991727 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podUID="f75c6f2b-0965-4e4d-9445-dc65d69c970b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.055817 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.063339 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.064059 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6lntk" podUID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service 
":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.064805 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-kzm75" podUID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.067448 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.067699 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-6lntk" podUID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.073557 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-kzm75" podUID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:29 crc kubenswrapper[4792]: > Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.457858 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-kgc9b" podUID="ac5ba665-4ead-4469-9d1c-c777bf26d579" containerName="speaker" probeResult="failure" output="Get 
\"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.458019 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-kgc9b" podUID="ac5ba665-4ead-4469-9d1c-c777bf26d579" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.502653 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-d6vrl container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.502736 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" podUID="e0c8715a-4133-49bd-b48f-12377582b8ce" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.502744 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-d6vrl container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.502812 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" 
podUID="e0c8715a-4133-49bd-b48f-12377582b8ce" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.925661 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.926084 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.925737 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.926379 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.938136 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" 
probeResult="failure" output="command timed out" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.938318 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.954550 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.954626 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.954680 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:29 crc kubenswrapper[4792]: I0318 16:54:29.954708 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.146348 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.322421 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.322498 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.627461 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" podUID="7d4badb4-1388-47c8-aed9-f8478388af41" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.43:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.628007 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-dbbd4" podUID="7d4badb4-1388-47c8-aed9-f8478388af41" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.43:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.793317 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.930526 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:30 crc kubenswrapper[4792]: I0318 16:54:30.931125 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.070987 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-zw8n7" podUID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:31 crc kubenswrapper[4792]: timeout: health rpc did not complete within 1s Mar 18 16:54:31 crc kubenswrapper[4792]: > Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.070983 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-zw8n7" podUID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:31 crc kubenswrapper[4792]: timeout: health rpc did not complete within 1s Mar 18 16:54:31 crc kubenswrapper[4792]: > Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.208085 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get 
\"https://10.217.0.173:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.208183 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.214166 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" podUID="14667803-000a-4186-8eb1-da78ce4812a0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.214189 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" podUID="407238a6-a2e5-420c-801b-8a4329eebadd" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.876245 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.876321 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" 
podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.876268 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.877947 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.880696 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.880768 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.880850 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.880890 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:31 crc kubenswrapper[4792]: I0318 16:54:31.992746 4792 trace.go:236] Trace[905039022]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (18-Mar-2026 16:54:29.013) (total time: 2975ms): Mar 18 16:54:31 crc kubenswrapper[4792]: Trace[905039022]: [2.975885063s] [2.975885063s] END Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.174231 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rfpd8" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:32 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:32 crc kubenswrapper[4792]: > Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.595176 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-d6vrl container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.595207 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-d6vrl container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.595241 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" podUID="e0c8715a-4133-49bd-b48f-12377582b8ce" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.595241 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d6vrl" podUID="e0c8715a-4133-49bd-b48f-12377582b8ce" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.846808 4792 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.847217 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.888481 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.888550 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.889176 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.889236 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.905027 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Liveness 
probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.905103 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.905137 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": context deadline exceeded" start-of-body= Mar 18 16:54:32 crc kubenswrapper[4792]: I0318 16:54:32.905203 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": context deadline exceeded" Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.286951 4792 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-rlx82 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.79:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.287333 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rlx82" podUID="7af72a3d-98a7-4a83-affa-3d382184fc59" containerName="nmstate-webhook" probeResult="failure" output="Get 
\"https://10.217.0.79:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.707259 4792 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-x5w94 container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.707333 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.707427 4792 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-x5w94 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.707450 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-x5w94" podUID="255ea945-6e83-4ead-b609-b47a6b5eaafa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.8:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.934512 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 18 16:54:33 crc kubenswrapper[4792]: 
I0318 16:54:33.934792 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:33 crc kubenswrapper[4792]: I0318 16:54:33.938078 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.055244 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-tls5q" podUID="6ccc988b-8909-4e90-b016-c94a1deb2de7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.096232 4792 patch_prober.go:28] interesting pod/perses-operator-d9577b4dd-zfrmv container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.096309 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.288230 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-26gf6" podUID="55d5f156-656e-4e2f-b368-e841124084d1" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.288262 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" podUID="d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.330442 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-xzpbw" podUID="3692a84a-23dc-4b6c-9c20-d97bd0e285d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.416981 4792 patch_prober.go:28] interesting pod/console-855f5dc7f-qnkcz container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.417293 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-855f5dc7f-qnkcz" podUID="0e8a660f-2f46-41b3-badb-2d1164cea860" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.417024 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-mf8vn" 
podUID="dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.477435 4792 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-dfcx2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.477509 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" podUID="1c367fec-09d4-46fa-8900-0c508ced5de9" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.596244 4792 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7c8cdd9f9f-lv5d4 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.596300 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-n4j9l" podUID="65722e7d-1557-437c-ae5c-383082933c8c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.596588 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" podUID="2c438c99-c0c4-43ec-a5e7-33a18425e63f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.637355 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-hpr6x" podUID="abc215c2-57eb-4c7a-b19d-0ed3ccd67001" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.678127 4792 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-86w5q container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.678375 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" podUID="f9dbb2aa-f06a-431d-b181-29315e9170cb" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.678488 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-77v8x" podUID="a518542e-e1c4-4754-9031-d3f1571abb27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: 
I0318 16:54:34.727423 4792 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.727474 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.768412 4792 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-gqm44 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.768753 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" podUID="8927cd79-8eff-4f53-a676-782cbb366e9c" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.924911 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": context deadline exceeded" 
start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.925536 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.926089 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.926091 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": context deadline exceeded" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.955745 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.955821 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.956954 4792 patch_prober.go:28] interesting 
pod/logging-loki-gateway-599d7cd94d-c8sjl container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:34 crc kubenswrapper[4792]: I0318 16:54:34.970414 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.035273 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-fb96d" podUID="5e4dd350-9a5b-4626-8b3d-6b9c097b4be1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.076186 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" podUID="05dba0ab-e659-4e0c-8713-4eebeca6edba" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.118304 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5df8f6d8b4-s75wc" podUID="eb5bab1d-63b4-4ae0-8dfe-734700253a4f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.194278 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-z4nr9" podUID="96809e41-8656-4095-a2f9-9d69c31efe61" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.361238 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqkxp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.74:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.361312 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" podUID="71255010-a6ae-4abf-88f1-f6c61c416ca1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.74:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.361429 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqkxp container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.74:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.361455 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-sqkxp" podUID="71255010-a6ae-4abf-88f1-f6c61c416ca1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.74:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.765716 4792 patch_prober.go:28] 
interesting pod/metrics-server-7db98db598-wxffp container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.765716 4792 patch_prober.go:28] interesting pod/metrics-server-7db98db598-wxffp container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.765890 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" podUID="de45de1b-91b3-41ff-9f73-95048b051745" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:35 crc kubenswrapper[4792]: I0318 16:54:35.765837 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7db98db598-wxffp" podUID="de45de1b-91b3-41ff-9f73-95048b051745" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.136618 4792 patch_prober.go:28] interesting pod/monitoring-plugin-69764bd9c7-ntlfk container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:36 crc 
kubenswrapper[4792]: I0318 16:54:36.137011 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-69764bd9c7-ntlfk" podUID="7bb8e68c-2183-4f5a-88a3-8c274d017247" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.208407 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.208614 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.211997 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.888872 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="2b458d30-1f6c-4042-989d-71e39a0aece2" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.22:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:36 crc kubenswrapper[4792]: I0318 16:54:36.889573 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" 
podUID="2b458d30-1f6c-4042-989d-71e39a0aece2" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.22:8080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.218188 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" podUID="8bdbd945-a92a-471b-8a37-c999fe503caa" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.218285 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-78dddf6df5-kxk85" podUID="8bdbd945-a92a-471b-8a37-c999fe503caa" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.366346 4792 patch_prober.go:28] interesting pod/thanos-querier-5f879c84-szp77 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.366412 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f879c84-szp77" podUID="a13f5c90-59fc-4b23-bf4d-0d4de34083e9" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.921835 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-kvj8m" 
podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.921950 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.922531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-kvj8m" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.922266 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.925705 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"c838efb357e9b230f9969499ec5989440f216e571e2fba7ccf75350f1f500ecb"} pod="metallb-system/frr-k8s-kvj8m" containerMessage="Container frr failed liveness probe, will be restarted" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.931506 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-s4q9g" podUID="c7af6f36-f51d-4d49-85d2-5d4081ad57a6" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 18 16:54:37 crc kubenswrapper[4792]: I0318 16:54:37.932011 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-kvj8m" 
podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="frr" containerID="cri-o://c838efb357e9b230f9969499ec5989440f216e571e2fba7ccf75350f1f500ecb" gracePeriod=2 Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.004346 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-9k68t" podUID="260602fc-bedf-40ec-92e7-a96e3ee009f0" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.004589 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-9k68t" podUID="260602fc-bedf-40ec-92e7-a96e3ee009f0" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.415430 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podUID="423d82c6-fd0b-4cb5-8ff2-501f479a9a73" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.457380 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-86pbc" podUID="423d82c6-fd0b-4cb5-8ff2-501f479a9a73" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.457486 4792 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-pk6tp container/authentication-operator namespace/openshift-authentication-operator: 
Liveness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.457583 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" podUID="57f5df54-714a-4cce-970a-7069ffd1cb63" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.457629 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.470492 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"6716d080072bceadd2ed5e3ce9167274a019a1d2afa7a5fdd7e6168de9ae7eff"} pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.470601 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" podUID="57f5df54-714a-4cce-970a-7069ffd1cb63" containerName="authentication-operator" containerID="cri-o://6716d080072bceadd2ed5e3ce9167274a019a1d2afa7a5fdd7e6168de9ae7eff" gracePeriod=30 Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.656941 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.657020 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.657144 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.657213 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.668910 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.669094 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.671584 4792 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"a081cddc76226392f2f0bf7390018718acdea88add2769fee03d4a57ae80a172"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.671656 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" containerID="cri-o://a081cddc76226392f2f0bf7390018718acdea88add2769fee03d4a57ae80a172" gracePeriod=30 Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924196 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924566 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924619 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924740 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924793 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924198 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-ltvvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.925023 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ltvvk" podUID="01af365e-5f9a-4030-b54e-ebee4cf39552" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.924243 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.925075 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.925118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.926237 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"2f0cc99480209f97cadc4df5d5eb1a81d298e54cea1154fceb6c603cd4b73530"} pod="openshift-console-operator/console-operator-58897d9998-bt6v4" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.926276 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" containerID="cri-o://2f0cc99480209f97cadc4df5d5eb1a81d298e54cea1154fceb6c603cd4b73530" gracePeriod=30 Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.930772 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.931012 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.931134 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.936466 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" 
podUID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.936580 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.938267 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"b9ca72fee08cecac34ea630ed28653def4e27ceec992fc19ffc782d6dc32d4a3"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Mar 18 16:54:38 crc kubenswrapper[4792]: I0318 16:54:38.938384 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerName="ceilometer-central-agent" containerID="cri-o://b9ca72fee08cecac34ea630ed28653def4e27ceec992fc19ffc782d6dc32d4a3" gracePeriod=30 Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006174 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006240 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006342 4792 patch_prober.go:28] interesting pod/router-default-5444994796-tfth7 container/router namespace/openshift-ingress: 
Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006373 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-tfth7" podUID="a0f79eb1-598c-4d72-af53-a928520fa0d9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006421 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmfwt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006439 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podUID="f75c6f2b-0965-4e4d-9445-dc65d69c970b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006481 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xmfwt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006500 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xmfwt" podUID="f75c6f2b-0965-4e4d-9445-dc65d69c970b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006686 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mm4h4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.006711 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" podUID="6a509c1f-d106-4f35-9226-58a6779b738b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.007852 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mm4h4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.007880 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mm4h4" podUID="6a509c1f-d106-4f35-9226-58a6779b738b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 
16:54:39.213474 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fb295773-c070-4d90-b351-cac7e8fa1017" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.284776 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.287282 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-6lntk" podUID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.287778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-kzm75" podUID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.291396 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-kzm75" podUID="dcf562be-1a5c-41e2-9355-706b833cb56e" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc 
kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.292431 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6lntk" podUID="1f303ba2-d191-4ad6-a474-de409ea5475b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.295664 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.298834 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-chj47" podUID="143df0e5-40e9-4536-8285-509497426831" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:39 crc kubenswrapper[4792]: > Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.378124 4792 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-2jfcp container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.378178 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" podUID="7d216655-c83a-4f17-9e9a-367579911a35" containerName="package-server-manager" probeResult="failure" output="Get 
\"http://10.217.0.26:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.378231 4792 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-2jfcp container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.378242 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jfcp" podUID="7d216655-c83a-4f17-9e9a-367579911a35" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.461180 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-kgc9b" podUID="ac5ba665-4ead-4469-9d1c-c777bf26d579" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.461515 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-kgc9b" podUID="ac5ba665-4ead-4469-9d1c-c777bf26d579" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.633161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerDied","Data":"c838efb357e9b230f9969499ec5989440f216e571e2fba7ccf75350f1f500ecb"} Mar 18 
16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.634401 4792 generic.go:334] "Generic (PLEG): container finished" podID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerID="c838efb357e9b230f9969499ec5989440f216e571e2fba7ccf75350f1f500ecb" exitCode=143 Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.670387 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.670460 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.925460 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-7f8hf container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.925534 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-7f8hf" podUID="ff112f55-c823-4d01-a355-08279e6a0391" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.931438 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.931595 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.935277 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.935366 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.935453 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-gkmkk" podUID="cc92711e-be7a-4025-9077-cac9e5bc7df8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.937727 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-gkmkk" podUID="cc92711e-be7a-4025-9077-cac9e5bc7df8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.955677 4792 patch_prober.go:28] interesting pod/logging-loki-gateway-599d7cd94d-c8sjl container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:39 crc kubenswrapper[4792]: I0318 16:54:39.955762 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-599d7cd94d-c8sjl" podUID="e0054d36-2f0d-43c8-93d2-774d775a22ea" 
containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.183258 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.183573 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.183666 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.364126 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.647935 4792 generic.go:334] "Generic (PLEG): container finished" podID="fbcfdc60-25a6-41e2-8dc1-eb9093393808" containerID="30bd6beaef6a77d24b34b0a75c102d53bf848328f730ff39fcfd76d7baf6ed05" exitCode=1 Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.648014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" event={"ID":"fbcfdc60-25a6-41e2-8dc1-eb9093393808","Type":"ContainerDied","Data":"30bd6beaef6a77d24b34b0a75c102d53bf848328f730ff39fcfd76d7baf6ed05"} Mar 18 16:54:40 
crc kubenswrapper[4792]: I0318 16:54:40.648959 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.674913 4792 scope.go:117] "RemoveContainer" containerID="30bd6beaef6a77d24b34b0a75c102d53bf848328f730ff39fcfd76d7baf6ed05" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.837213 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.837272 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.837372 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.939742 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.939892 4792 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.939963 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.940110 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.940193 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 16:54:40 crc kubenswrapper[4792]: I0318 16:54:40.942136 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.176318 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-zw8n7" podUID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:41 crc kubenswrapper[4792]: timeout: health rpc did not complete within 1s Mar 18 16:54:41 crc kubenswrapper[4792]: > Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.262603 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-zw8n7" podUID="e5ef8d1c-3435-4dcb-8397-2314c8795c3b" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:41 crc kubenswrapper[4792]: timeout: health rpc did not complete within 1s Mar 18 16:54:41 crc 
kubenswrapper[4792]: > Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.297194 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" podUID="407238a6-a2e5-420c-801b-8a4329eebadd" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.297194 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" podUID="14667803-000a-4186-8eb1-da78ce4812a0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.338217 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c79f466d7-95zwp" podUID="14667803-000a-4186-8eb1-da78ce4812a0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.338268 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-7f795bfd45-wf9cm" podUID="407238a6-a2e5-420c-801b-8a4329eebadd" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.338231 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" podUID="155eb4c3-aa63-4ec7-9824-1bef2045a68b" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.682714 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-bt6v4_106c799a-83d0-4815-ab5a-61c2b67b86f7/console-operator/0.log" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.683466 4792 generic.go:334] "Generic (PLEG): container finished" podID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerID="2f0cc99480209f97cadc4df5d5eb1a81d298e54cea1154fceb6c603cd4b73530" exitCode=1 Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.683583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" event={"ID":"106c799a-83d0-4815-ab5a-61c2b67b86f7","Type":"ContainerDied","Data":"2f0cc99480209f97cadc4df5d5eb1a81d298e54cea1154fceb6c603cd4b73530"} Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.687608 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" event={"ID":"fbcfdc60-25a6-41e2-8dc1-eb9093393808","Type":"ContainerStarted","Data":"d01bc1dca468db14ac75a970825548a97527f5e043f9b80fde08ccb49e152f70"} Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.688790 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.702980 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kvj8m" event={"ID":"fb6ddafa-95ff-43b2-be7b-352a7fab9d05","Type":"ContainerStarted","Data":"a96844c38a81ecc36d09809848f7ac18bcaafcf3aa60ef996b97818fa2c10c88"} Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.794585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kvj8m" Mar 18 
16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.880219 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" podUID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.880578 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.880611 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.880682 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881530 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881558 4792 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881792 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881837 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881909 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881954 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get 
\"https://10.217.0.68:6443/healthz\": context deadline exceeded" start-of-body= Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.881999 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": context deadline exceeded" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.882029 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.882651 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"03bb699f6af10beb26f633c4d3cd5b1a536082ecfeb4fcce3ee5370856a694e4"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.882683 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" containerID="cri-o://03bb699f6af10beb26f633c4d3cd5b1a536082ecfeb4fcce3ee5370856a694e4" gracePeriod=30 Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.930298 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:41 crc kubenswrapper[4792]: I0318 16:54:41.930304 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" 
podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" probeResult="failure" output="command timed out" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.142415 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rfpd8" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:42 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:42 crc kubenswrapper[4792]: > Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.501524 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.501778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.721062 4792 generic.go:334] "Generic (PLEG): container finished" podID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerID="a081cddc76226392f2f0bf7390018718acdea88add2769fee03d4a57ae80a172" exitCode=0 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.721136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" event={"ID":"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073","Type":"ContainerDied","Data":"a081cddc76226392f2f0bf7390018718acdea88add2769fee03d4a57ae80a172"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.728246 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.731486 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.735782 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.735854 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8ec9fdc43dd0c98d404cb7c336f638c250927d8cf6308efa9ac38d9f4433d8b7" exitCode=1 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.735961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8ec9fdc43dd0c98d404cb7c336f638c250927d8cf6308efa9ac38d9f4433d8b7"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.736014 4792 scope.go:117] "RemoveContainer" containerID="07102eb834ab562b238a7946fe83b050cd7a305a94c5cadadcc385f733854cb4" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.737978 4792 scope.go:117] "RemoveContainer" containerID="8ec9fdc43dd0c98d404cb7c336f638c250927d8cf6308efa9ac38d9f4433d8b7" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.739489 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c438c99-c0c4-43ec-a5e7-33a18425e63f" containerID="7b1746bbd148f1baaca4c85d65c6817269b56d83496fb8702869ce33e38412cc" exitCode=1 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.739544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" event={"ID":"2c438c99-c0c4-43ec-a5e7-33a18425e63f","Type":"ContainerDied","Data":"7b1746bbd148f1baaca4c85d65c6817269b56d83496fb8702869ce33e38412cc"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.741278 4792 scope.go:117] "RemoveContainer" containerID="7b1746bbd148f1baaca4c85d65c6817269b56d83496fb8702869ce33e38412cc" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.742380 4792 generic.go:334] "Generic (PLEG): container finished" podID="8cf0ba21-2c05-4e3d-8925-114487cc4998" containerID="1ad34a75b919ab47351fb5a8af42351a451242dbb82605331b9aeb32d8ed9722" exitCode=1 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.742444 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" event={"ID":"8cf0ba21-2c05-4e3d-8925-114487cc4998","Type":"ContainerDied","Data":"1ad34a75b919ab47351fb5a8af42351a451242dbb82605331b9aeb32d8ed9722"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.742898 4792 scope.go:117] "RemoveContainer" containerID="1ad34a75b919ab47351fb5a8af42351a451242dbb82605331b9aeb32d8ed9722" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.750815 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-bt6v4_106c799a-83d0-4815-ab5a-61c2b67b86f7/console-operator/0.log" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.750938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" event={"ID":"106c799a-83d0-4815-ab5a-61c2b67b86f7","Type":"ContainerStarted","Data":"0836dc1238f6444192dd749ca53c6d652c61cb5cfbcb258ae1926825db8137ed"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.751495 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: 
Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.751541 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.753230 4792 generic.go:334] "Generic (PLEG): container finished" podID="171cc469-d2f4-4ca4-b841-144bb81881be" containerID="03bb699f6af10beb26f633c4d3cd5b1a536082ecfeb4fcce3ee5370856a694e4" exitCode=0 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.753878 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.753916 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" event={"ID":"171cc469-d2f4-4ca4-b841-144bb81881be","Type":"ContainerDied","Data":"03bb699f6af10beb26f633c4d3cd5b1a536082ecfeb4fcce3ee5370856a694e4"} Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.755771 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"10168ff951a284c9895622bfa17204f92cec700200ba956dfbb3754706a57c2a"} pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" containerMessage="Container oauth-openshift failed liveness probe, will be restarted" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.836152 4792 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-kvj8m" podUID="fb6ddafa-95ff-43b2-be7b-352a7fab9d05" containerName="frr" 
probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.847704 4792 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.847811 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.847868 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.862889 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-scheduler" containerStatusID={"Type":"cri-o","ID":"b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9"} pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" containerMessage="Container kube-scheduler failed liveness probe, will be restarted" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.863041 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" containerID="cri-o://b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9" gracePeriod=30 Mar 18 16:54:42 crc 
kubenswrapper[4792]: I0318 16:54:42.940539 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="0261599b-51cd-4d30-8c8b-d146dc22de90" containerName="prometheus" probeResult="failure" output="command timed out" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.943326 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.957253 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.957315 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.968356 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.968428 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" 
podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.968497 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.968878 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.968948 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969015 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969028 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969064 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969704 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"7e87a275953afd7ac06d802f094c79c4894a77a0b2695c2b6a202ba59e9db1d8"} pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" containerMessage="Container controller-manager failed liveness probe, will be restarted" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969744 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" containerID="cri-o://7e87a275953afd7ac06d802f094c79c4894a77a0b2695c2b6a202ba59e9db1d8" gracePeriod=30 Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969895 4792 patch_prober.go:28] interesting pod/oauth-openshift-86567d79f8-v9v86 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.969916 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.972699 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"e7b9506860d932602869ba0047423b6c6cdbabd65989660202038098d0dda043"} pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 18 16:54:42 crc kubenswrapper[4792]: I0318 16:54:42.972751 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" containerID="cri-o://e7b9506860d932602869ba0047423b6c6cdbabd65989660202038098d0dda043" gracePeriod=30 Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.248886 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.511726 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.518110 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.518168 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.727190 4792 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 
192.168.126.11:10259: connect: connection refused" start-of-body= Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.727536 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.827167 4792 generic.go:334] "Generic (PLEG): container finished" podID="57f5df54-714a-4cce-970a-7069ffd1cb63" containerID="6716d080072bceadd2ed5e3ce9167274a019a1d2afa7a5fdd7e6168de9ae7eff" exitCode=0 Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.827238 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" event={"ID":"57f5df54-714a-4cce-970a-7069ffd1cb63","Type":"ContainerDied","Data":"6716d080072bceadd2ed5e3ce9167274a019a1d2afa7a5fdd7e6168de9ae7eff"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.847066 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.854536 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.878770 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.887897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" 
event={"ID":"7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073","Type":"ContainerStarted","Data":"7dec9f641fd833c192d045aecedd8602d230d995229b4cc0bf0e925013f40a04"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.887959 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.888758 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.888815 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.911003 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" event={"ID":"2c438c99-c0c4-43ec-a5e7-33a18425e63f","Type":"ContainerStarted","Data":"fe9d45aa4d313d680b9ea19ed69c92478d5282fdb408cc3ea139e4d2cc6580df"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.911384 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.937193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" 
event={"ID":"8cf0ba21-2c05-4e3d-8925-114487cc4998","Type":"ContainerStarted","Data":"48bee00541824dd1a32e114ad52fb2698527ea53c89efbf5e645a6a985aba9ce"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.937472 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.946611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" event={"ID":"171cc469-d2f4-4ca4-b841-144bb81881be","Type":"ContainerStarted","Data":"663cb88a4415480a7b2181f8e0ed2b2fbfb0aaf2da8ffe4a79353c0ec8f3a876"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.947126 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.947195 4792 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.947223 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.950230 4792 generic.go:334] "Generic (PLEG): container finished" podID="05dba0ab-e659-4e0c-8713-4eebeca6edba" 
containerID="5427ca2d12b9cec1ba004075d94d14c0c5b70c62c051f4dfefc1235af6d79492" exitCode=1 Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.951313 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" event={"ID":"05dba0ab-e659-4e0c-8713-4eebeca6edba","Type":"ContainerDied","Data":"5427ca2d12b9cec1ba004075d94d14c0c5b70c62c051f4dfefc1235af6d79492"} Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.951648 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.951677 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 16:54:43 crc kubenswrapper[4792]: I0318 16:54:43.958200 4792 scope.go:117] "RemoveContainer" containerID="5427ca2d12b9cec1ba004075d94d14c0c5b70c62c051f4dfefc1235af6d79492" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.013622 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.013681 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.102240 4792 patch_prober.go:28] interesting pod/perses-operator-d9577b4dd-zfrmv container/perses-operator namespace/openshift-operators: Liveness probe 
status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.102314 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.102797 4792 patch_prober.go:28] interesting pod/perses-operator-d9577b4dd-zfrmv container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.102867 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" podUID="15bde542-1ffd-48b4-b2cf-98d98348920e" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.102952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.252253 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.252328 4792 
prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-nvs5w" podUID="af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.386217 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.477737 4792 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-dfcx2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.477806 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" podUID="1c367fec-09d4-46fa-8900-0c508ced5de9" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.477877 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.639149 4792 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-86w5q container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.639199 4792 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" podUID="f9dbb2aa-f06a-431d-b181-29315e9170cb" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.639273 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.661847 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-d9577b4dd-zfrmv" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.767827 4792 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-gqm44 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.767932 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" podUID="8927cd79-8eff-4f53-a676-782cbb366e9c" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.768092 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.782719 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-dfcx2" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 
16:54:44.855486 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-86w5q" Mar 18 16:54:44 crc kubenswrapper[4792]: I0318 16:54:44.944990 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-gqm44" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.044494 4792 generic.go:334] "Generic (PLEG): container finished" podID="539835fd-4134-4cab-8c05-f7df74b38042" containerID="e7b9506860d932602869ba0047423b6c6cdbabd65989660202038098d0dda043" exitCode=0 Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.045874 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" event={"ID":"539835fd-4134-4cab-8c05-f7df74b38042","Type":"ContainerDied","Data":"e7b9506860d932602869ba0047423b6c6cdbabd65989660202038098d0dda043"} Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.065895 4792 generic.go:334] "Generic (PLEG): container finished" podID="d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512" containerID="65eba7399a3f8c7ae46488ad152addb6c5e00cabe6dd5f8816bc3ae525da88aa" exitCode=1 Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.066007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" event={"ID":"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512","Type":"ContainerDied","Data":"65eba7399a3f8c7ae46488ad152addb6c5e00cabe6dd5f8816bc3ae525da88aa"} Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.075885 4792 generic.go:334] "Generic (PLEG): container finished" podID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerID="7e87a275953afd7ac06d802f094c79c4894a77a0b2695c2b6a202ba59e9db1d8" exitCode=0 Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.076251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" event={"ID":"18572155-5ab2-4ee2-bda9-3bd91f07b526","Type":"ContainerDied","Data":"7e87a275953afd7ac06d802f094c79c4894a77a0b2695c2b6a202ba59e9db1d8"} Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.078486 4792 scope.go:117] "RemoveContainer" containerID="65eba7399a3f8c7ae46488ad152addb6c5e00cabe6dd5f8816bc3ae525da88aa" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.084260 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.084408 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.084435 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.084529 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.085954 4792 patch_prober.go:28] 
interesting pod/prometheus-operator-admission-webhook-f54c54754-2jldb container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.086211 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" podUID="171cc469-d2f4-4ca4-b841-144bb81881be" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.711031 4792 trace.go:236] Trace[1034104504]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (18-Mar-2026 16:54:40.568) (total time: 5121ms): Mar 18 16:54:45 crc kubenswrapper[4792]: Trace[1034104504]: [5.121803265s] [5.121803265s] END Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.749611 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" podUID="ae1d2de8-ac87-4f0e-97c5-3bbb88279055" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.96:8080/readyz\": dial tcp 10.217.0.96:8080: connect: connection refused" Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.902328 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" containerID="cri-o://138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a" gracePeriod=25 Mar 18 16:54:45 crc kubenswrapper[4792]: I0318 16:54:45.904868 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" 
podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" containerID="cri-o://1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862" gracePeriod=26 Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.106495 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9" exitCode=0 Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.106590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b434f5ab1ec3c7d75561d6d85ae6f4ddb0c737418b0ef75ef9577afddc8999b9"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.109198 4792 generic.go:334] "Generic (PLEG): container finished" podID="ae1d2de8-ac87-4f0e-97c5-3bbb88279055" containerID="4b346a0aee5fa66b859a2a5f724e282e700e0798f7a48b37c59703fc45893388" exitCode=1 Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.109249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" event={"ID":"ae1d2de8-ac87-4f0e-97c5-3bbb88279055","Type":"ContainerDied","Data":"4b346a0aee5fa66b859a2a5f724e282e700e0798f7a48b37c59703fc45893388"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.110316 4792 scope.go:117] "RemoveContainer" containerID="4b346a0aee5fa66b859a2a5f724e282e700e0798f7a48b37c59703fc45893388" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.111851 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.115564 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.115984 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51c5f3c8cfe2a728eff29d6d110ecf8f1f99540be169139238387095cd98ffb1"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.121506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" event={"ID":"d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512","Type":"ContainerStarted","Data":"3b016d4ad9edcc79494fe3cf164f83e8a6762e440fbb6183613229c9394e7f21"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.122109 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.124501 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" event={"ID":"18572155-5ab2-4ee2-bda9-3bd91f07b526","Type":"ContainerStarted","Data":"bd5f43b5130d4ff6024af31549143abc48bf0f5b220f15493634bf3d7606883e"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.125500 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.125604 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" start-of-body= Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 
16:54:46.125655 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.128174 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec93232a-54d0-42a4-a659-ed6fc86913c6" containerID="b9ca72fee08cecac34ea630ed28653def4e27ceec992fc19ffc782d6dc32d4a3" exitCode=0 Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.128254 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerDied","Data":"b9ca72fee08cecac34ea630ed28653def4e27ceec992fc19ffc782d6dc32d4a3"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.133497 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" event={"ID":"539835fd-4134-4cab-8c05-f7df74b38042","Type":"ContainerStarted","Data":"bbfd702f125ac4e1d92080bde57a9606039fefc4f5832fe99705c5946c6b90ff"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.136415 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.134644 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.136476 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.149820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pk6tp" event={"ID":"57f5df54-714a-4cce-970a-7069ffd1cb63","Type":"ContainerStarted","Data":"093a6929e30dfce51289030221818ddce78fa7061c401b434564308334f5832d"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.160027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" event={"ID":"05dba0ab-e659-4e0c-8713-4eebeca6edba","Type":"ContainerStarted","Data":"850e050367ff1d46a581646197a47a161577c8249391822ffc116ae88ef37187"} Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.161142 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.215230 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:46 crc kubenswrapper[4792]: > Mar 18 16:54:46 crc kubenswrapper[4792]: I0318 16:54:46.957760 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kvj8m" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.175881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e4e79c8c0c27dded7428c3dbb79609eefce1672cdb6133e82ed7a6c96d168e77"} Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.176424 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.180817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" event={"ID":"ae1d2de8-ac87-4f0e-97c5-3bbb88279055","Type":"ContainerStarted","Data":"084356d5d95943abbfaca168d9df69b13a74ac007157ae6075233f305d55d47d"} Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181511 4792 status_manager.go:317] "Container readiness changed for unknown container" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" containerID="cri-o://4b346a0aee5fa66b859a2a5f724e282e700e0798f7a48b37c59703fc45893388" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181773 4792 patch_prober.go:28] interesting pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181807 4792 patch_prober.go:28] interesting pod/route-controller-manager-5c8d5cd46d-j96gd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181817 
4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.181835 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" podUID="539835fd-4134-4cab-8c05-f7df74b38042" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.656785 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.657116 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.656787 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-25vh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.657242 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" podUID="7d3b1cf6-cf28-4d83-a5e5-2a7cf1613073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.850598 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.850908 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.858486 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-bt6v4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 18 16:54:47 crc kubenswrapper[4792]: I0318 16:54:47.858555 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" podUID="106c799a-83d0-4815-ab5a-61c2b67b86f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 18 16:54:48 crc kubenswrapper[4792]: E0318 16:54:48.123030 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:48 crc kubenswrapper[4792]: E0318 16:54:48.124836 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:48 crc kubenswrapper[4792]: E0318 16:54:48.126293 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:48 crc kubenswrapper[4792]: E0318 16:54:48.126326 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerName="galera" Mar 18 16:54:48 crc kubenswrapper[4792]: I0318 16:54:48.200698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec93232a-54d0-42a4-a659-ed6fc86913c6","Type":"ContainerStarted","Data":"988d6c328af010959c1593d2efea89f5afcdf60f5a93ff4a1fc7f07274ac2dec"} Mar 18 16:54:48 crc kubenswrapper[4792]: I0318 16:54:48.203472 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 16:54:48 crc kubenswrapper[4792]: I0318 16:54:48.203605 4792 patch_prober.go:28] interesting 
pod/controller-manager-6477764b84-dhhrv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" start-of-body= Mar 18 16:54:48 crc kubenswrapper[4792]: I0318 16:54:48.203663 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" podUID="18572155-5ab2-4ee2-bda9-3bd91f07b526" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" Mar 18 16:54:48 crc kubenswrapper[4792]: I0318 16:54:48.469271 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:54:49 crc kubenswrapper[4792]: I0318 16:54:49.110330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-chpgl" Mar 18 16:54:49 crc kubenswrapper[4792]: I0318 16:54:49.224091 4792 generic.go:334] "Generic (PLEG): container finished" podID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerID="1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862" exitCode=0 Mar 18 16:54:49 crc kubenswrapper[4792]: I0318 16:54:49.225231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerDied","Data":"1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862"} Mar 18 16:54:49 crc kubenswrapper[4792]: E0318 16:54:49.555793 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862 is running failed: container process not found" containerID="1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:49 crc kubenswrapper[4792]: E0318 16:54:49.556945 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862 is running failed: container process not found" containerID="1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:49 crc kubenswrapper[4792]: E0318 16:54:49.559734 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862 is running failed: container process not found" containerID="1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 16:54:49 crc kubenswrapper[4792]: E0318 16:54:49.559781 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c08babd1d5901d58d86e22043b922bdb3309f2e1d695e1a052a096fe2072862 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a64098b6-eb41-40ef-8d9b-6dd69c107ee2" containerName="galera" Mar 18 16:54:49 crc kubenswrapper[4792]: I0318 16:54:49.763251 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp" Mar 18 16:54:50 crc kubenswrapper[4792]: I0318 16:54:50.240765 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a64098b6-eb41-40ef-8d9b-6dd69c107ee2","Type":"ContainerStarted","Data":"91409c382cc592a71093c6f42df3bd3f9a4172097c9466694f44be055e3d9bcb"} Mar 18 16:54:50 crc kubenswrapper[4792]: I0318 16:54:50.249352 4792 generic.go:334] "Generic (PLEG): container finished" podID="b38dfbae-0508-4b57-b5d8-d47fcdd35fd6" containerID="138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a" exitCode=0 Mar 18 16:54:50 crc kubenswrapper[4792]: I0318 16:54:50.249412 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerDied","Data":"138ecac56a58b4805e65647753695c91bf78797de466589adddfa67719044a8a"} Mar 18 16:54:50 crc kubenswrapper[4792]: I0318 16:54:50.884614 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-2jldb" Mar 18 16:54:51 crc kubenswrapper[4792]: I0318 16:54:51.151438 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:54:51 crc kubenswrapper[4792]: I0318 16:54:51.267710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38dfbae-0508-4b57-b5d8-d47fcdd35fd6","Type":"ContainerStarted","Data":"432686ac497d7d8e9d36cd62a8bababc21edc6922822325548136d318a04d7ec"} Mar 18 16:54:51 crc kubenswrapper[4792]: I0318 16:54:51.877909 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c8d5cd46d-j96gd" Mar 18 16:54:51 crc kubenswrapper[4792]: I0318 16:54:51.903730 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6477764b84-dhhrv" Mar 18 16:54:52 crc kubenswrapper[4792]: I0318 16:54:52.252143 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rfpd8" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:52 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:52 crc kubenswrapper[4792]: > Mar 18 16:54:53 crc kubenswrapper[4792]: I0318 16:54:53.252559 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-gc45m" Mar 18 16:54:53 crc kubenswrapper[4792]: I0318 16:54:53.511253 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:53 crc kubenswrapper[4792]: I0318 16:54:53.521149 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7c8cdd9f9f-lv5d4" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.016818 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-5c65h" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.093578 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.093668 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.096434 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" 
containerStatusID={"Type":"cri-o","ID":"341949ae4e0b6b9ca7fd9a86f851b16cae109de816ab7917c8bc28440310f62a"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.096505 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" containerID="cri-o://341949ae4e0b6b9ca7fd9a86f851b16cae109de816ab7917c8bc28440310f62a" gracePeriod=30 Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.115390 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-b7zpr" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.387147 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:54 crc kubenswrapper[4792]: I0318 16:54:54.391396 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:55 crc kubenswrapper[4792]: I0318 16:54:55.318845 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 16:54:56 crc kubenswrapper[4792]: I0318 16:54:56.183724 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:54:56 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:54:56 crc kubenswrapper[4792]: > Mar 18 16:54:57 crc kubenswrapper[4792]: I0318 16:54:57.057650 4792 scope.go:117] "RemoveContainer" 
containerID="29b35c5c9e98c2668cc597b097a617f64a4f7b5ba5ea2e3fea96efbdc844279b" Mar 18 16:54:57 crc kubenswrapper[4792]: I0318 16:54:57.666683 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-25vh8" Mar 18 16:54:57 crc kubenswrapper[4792]: I0318 16:54:57.867944 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bt6v4" Mar 18 16:54:58 crc kubenswrapper[4792]: I0318 16:54:58.120227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 16:54:58 crc kubenswrapper[4792]: I0318 16:54:58.120686 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 16:54:59 crc kubenswrapper[4792]: I0318 16:54:59.367487 4792 generic.go:334] "Generic (PLEG): container finished" podID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerID="341949ae4e0b6b9ca7fd9a86f851b16cae109de816ab7917c8bc28440310f62a" exitCode=0 Mar 18 16:54:59 crc kubenswrapper[4792]: I0318 16:54:59.367826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89","Type":"ContainerDied","Data":"341949ae4e0b6b9ca7fd9a86f851b16cae109de816ab7917c8bc28440310f62a"} Mar 18 16:54:59 crc kubenswrapper[4792]: I0318 16:54:59.552407 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 16:54:59 crc kubenswrapper[4792]: I0318 16:54:59.552465 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 16:55:00 crc kubenswrapper[4792]: I0318 16:55:00.322241 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:55:00 crc kubenswrapper[4792]: I0318 16:55:00.322632 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:55:01 crc kubenswrapper[4792]: I0318 16:55:01.161569 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:55:01 crc kubenswrapper[4792]: I0318 16:55:01.220531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:55:01 crc kubenswrapper[4792]: I0318 16:55:01.391107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d41b9217-24bd-4b7c-98f7-04ec8ca9bf89","Type":"ContainerStarted","Data":"53f519191a033b340539b3616cb214e7bdfe092412ea6c5609912d200564ee80"} Mar 18 16:55:01 crc kubenswrapper[4792]: I0318 16:55:01.424599 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:55:02 crc kubenswrapper[4792]: I0318 16:55:02.400766 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rfpd8" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" containerID="cri-o://edecd53cc19d803b55df92be90939338bf32b6ca4386a1916dd0cba529bb8f95" gracePeriod=2 Mar 18 16:55:03 crc kubenswrapper[4792]: I0318 16:55:03.415021 4792 generic.go:334] "Generic (PLEG): container finished" podID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerID="edecd53cc19d803b55df92be90939338bf32b6ca4386a1916dd0cba529bb8f95" exitCode=0 Mar 18 16:55:03 crc 
kubenswrapper[4792]: I0318 16:55:03.415111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerDied","Data":"edecd53cc19d803b55df92be90939338bf32b6ca4386a1916dd0cba529bb8f95"} Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.348155 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.429802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m58l\" (UniqueName: \"kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l\") pod \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.430313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities\") pod \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.430343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content\") pod \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\" (UID: \"3cb12145-fbf5-43d5-8413-bf81eb70e3f6\") " Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.437449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities" (OuterVolumeSpecName: "utilities") pod "3cb12145-fbf5-43d5-8413-bf81eb70e3f6" (UID: "3cb12145-fbf5-43d5-8413-bf81eb70e3f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.443639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rfpd8" event={"ID":"3cb12145-fbf5-43d5-8413-bf81eb70e3f6","Type":"ContainerDied","Data":"95213080f912d3d4296d72b351f520df18698f20e038fca8e07c363fa0e3acef"} Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.443866 4792 scope.go:117] "RemoveContainer" containerID="edecd53cc19d803b55df92be90939338bf32b6ca4386a1916dd0cba529bb8f95" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.443721 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rfpd8" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.453469 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l" (OuterVolumeSpecName: "kube-api-access-5m58l") pod "3cb12145-fbf5-43d5-8413-bf81eb70e3f6" (UID: "3cb12145-fbf5-43d5-8413-bf81eb70e3f6"). InnerVolumeSpecName "kube-api-access-5m58l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.494263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cb12145-fbf5-43d5-8413-bf81eb70e3f6" (UID: "3cb12145-fbf5-43d5-8413-bf81eb70e3f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.533491 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m58l\" (UniqueName: \"kubernetes.io/projected/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-kube-api-access-5m58l\") on node \"crc\" DevicePath \"\"" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.533800 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.533812 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cb12145-fbf5-43d5-8413-bf81eb70e3f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.539288 4792 scope.go:117] "RemoveContainer" containerID="e77ce8d1ede1f77ace1e704492c381d67e4e2aeec9c8557ea2301dbb6b0cd211" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.568824 4792 scope.go:117] "RemoveContainer" containerID="341f3f72cff203564e73b8e2b2eda6861717198e718dac20e09ac134c558ce44" Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.796140 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:55:04 crc kubenswrapper[4792]: I0318 16:55:04.809262 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rfpd8"] Mar 18 16:55:05 crc kubenswrapper[4792]: I0318 16:55:05.871347 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" path="/var/lib/kubelet/pods/3cb12145-fbf5-43d5-8413-bf81eb70e3f6/volumes" Mar 18 16:55:06 crc kubenswrapper[4792]: I0318 16:55:06.051006 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 
16:55:06 crc kubenswrapper[4792]: I0318 16:55:06.069156 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:55:06 crc kubenswrapper[4792]: I0318 16:55:06.185667 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:55:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:55:06 crc kubenswrapper[4792]: > Mar 18 16:55:07 crc kubenswrapper[4792]: I0318 16:55:07.850146 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" podUID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerName="oauth-openshift" containerID="cri-o://10168ff951a284c9895622bfa17204f92cec700200ba956dfbb3754706a57c2a" gracePeriod=15 Mar 18 16:55:08 crc kubenswrapper[4792]: I0318 16:55:08.498024 4792 generic.go:334] "Generic (PLEG): container finished" podID="f7b32fdd-97ca-4c56-8981-4bdff318a1e1" containerID="10168ff951a284c9895622bfa17204f92cec700200ba956dfbb3754706a57c2a" exitCode=0 Mar 18 16:55:08 crc kubenswrapper[4792]: I0318 16:55:08.498069 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" event={"ID":"f7b32fdd-97ca-4c56-8981-4bdff318a1e1","Type":"ContainerDied","Data":"10168ff951a284c9895622bfa17204f92cec700200ba956dfbb3754706a57c2a"} Mar 18 16:55:09 crc kubenswrapper[4792]: E0318 16:55:09.184321 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:40176->38.102.83.158:37731: write tcp 38.102.83.158:40176->38.102.83.158:37731: write: connection reset by peer Mar 18 16:55:09 crc kubenswrapper[4792]: E0318 
16:55:09.452300 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:40190->38.102.83.158:37731: write tcp 38.102.83.158:40190->38.102.83.158:37731: write: connection reset by peer Mar 18 16:55:10 crc kubenswrapper[4792]: I0318 16:55:10.525097 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" event={"ID":"f7b32fdd-97ca-4c56-8981-4bdff318a1e1","Type":"ContainerStarted","Data":"a195ac61dd265495b14ea426f781edad4ef4a58871932f430736566e711e13f3"} Mar 18 16:55:10 crc kubenswrapper[4792]: I0318 16:55:10.525482 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 16:55:10 crc kubenswrapper[4792]: I0318 16:55:10.576765 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86567d79f8-v9v86" Mar 18 16:55:10 crc kubenswrapper[4792]: I0318 16:55:10.888829 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 16:55:11 crc kubenswrapper[4792]: I0318 16:55:11.089802 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 16:55:11 crc kubenswrapper[4792]: I0318 16:55:11.405558 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="d41b9217-24bd-4b7c-98f7-04ec8ca9bf89" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 16:55:12 crc kubenswrapper[4792]: I0318 16:55:12.965034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 16:55:13 crc kubenswrapper[4792]: I0318 16:55:13.834017 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 16:55:16 crc kubenswrapper[4792]: 
I0318 16:55:16.078776 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 16:55:16 crc kubenswrapper[4792]: I0318 16:55:16.294431 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:55:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:55:16 crc kubenswrapper[4792]: > Mar 18 16:55:25 crc kubenswrapper[4792]: I0318 16:55:25.775450 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7cfbbd978d-5f96z" Mar 18 16:55:26 crc kubenswrapper[4792]: I0318 16:55:26.197179 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:55:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:55:26 crc kubenswrapper[4792]: > Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.322816 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.325600 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.325664 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.328283 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.329239 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466" gracePeriod=600 Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.862007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466"} Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.862575 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466" exitCode=0 Mar 18 16:55:30 crc kubenswrapper[4792]: I0318 16:55:30.866446 4792 scope.go:117] "RemoveContainer" containerID="68ae1453eb19e87fb5a5d392014b0ce25c8943da1e966a6b4ac09898f0f6d470" Mar 18 16:55:31 crc kubenswrapper[4792]: I0318 16:55:31.880640 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd"} Mar 18 16:55:33 crc kubenswrapper[4792]: I0318 16:55:33.731899 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 16:55:36 crc kubenswrapper[4792]: I0318 16:55:36.223332 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:55:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:55:36 crc kubenswrapper[4792]: > Mar 18 16:55:36 crc kubenswrapper[4792]: I0318 16:55:36.224476 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:55:36 crc kubenswrapper[4792]: I0318 16:55:36.226136 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65"} pod="openshift-marketplace/redhat-operators-7g6m5" containerMessage="Container registry-server failed startup probe, will be restarted" Mar 18 16:55:36 crc kubenswrapper[4792]: I0318 16:55:36.226197 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" containerID="cri-o://26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65" gracePeriod=30 Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.223997 4792 generic.go:334] "Generic (PLEG): container finished" podID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerID="26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65" exitCode=0 Mar 18 16:56:00 
crc kubenswrapper[4792]: I0318 16:56:00.224013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerDied","Data":"26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65"} Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.224480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerStarted","Data":"703b2ad4db989bac2dec66061862e328345c917db913de4aaa230dd70132b1da"} Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.664677 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564216-zn66p"] Mar 18 16:56:00 crc kubenswrapper[4792]: E0318 16:56:00.671275 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.671317 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4792]: E0318 16:56:00.671796 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a2888c-51a4-4f58-8bab-671f62c3c29f" containerName="oc" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.671817 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a2888c-51a4-4f58-8bab-671f62c3c29f" containerName="oc" Mar 18 16:56:00 crc kubenswrapper[4792]: E0318 16:56:00.671846 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="extract-content" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.671858 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="extract-content" Mar 18 16:56:00 crc kubenswrapper[4792]: 
E0318 16:56:00.671941 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="extract-utilities" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.671951 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="extract-utilities" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.676013 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a2888c-51a4-4f58-8bab-671f62c3c29f" containerName="oc" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.676085 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb12145-fbf5-43d5-8413-bf81eb70e3f6" containerName="registry-server" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.684895 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.698984 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.699040 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.699057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.754688 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-zn66p"] Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.765153 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwnx\" (UniqueName: \"kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx\") pod \"auto-csr-approver-29564216-zn66p\" (UID: 
\"23ab00b2-385f-4a41-bf3b-3a85c8121d3f\") " pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.813503 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.816523 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.832046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.868202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwnx\" (UniqueName: \"kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx\") pod \"auto-csr-approver-29564216-zn66p\" (UID: \"23ab00b2-385f-4a41-bf3b-3a85c8121d3f\") " pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.924461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwnx\" (UniqueName: \"kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx\") pod \"auto-csr-approver-29564216-zn66p\" (UID: \"23ab00b2-385f-4a41-bf3b-3a85c8121d3f\") " pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.971214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.971345 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hff\" (UniqueName: \"kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:00 crc kubenswrapper[4792]: I0318 16:56:00.971568 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.023055 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.073735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.073829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.073900 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hff\" (UniqueName: \"kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff\") pod 
\"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.078780 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.079774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.101883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hff\" (UniqueName: \"kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff\") pod \"community-operators-sxl8x\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:01 crc kubenswrapper[4792]: I0318 16:56:01.139315 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:02 crc kubenswrapper[4792]: I0318 16:56:02.396553 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-zn66p"] Mar 18 16:56:02 crc kubenswrapper[4792]: I0318 16:56:02.416373 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:03 crc kubenswrapper[4792]: I0318 16:56:03.285552 4792 generic.go:334] "Generic (PLEG): container finished" podID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerID="f42a0affbc5ac27c4f4d5d53851d6a78b918079292472be66ede8c6a9dc8b466" exitCode=0 Mar 18 16:56:03 crc kubenswrapper[4792]: I0318 16:56:03.285690 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerDied","Data":"f42a0affbc5ac27c4f4d5d53851d6a78b918079292472be66ede8c6a9dc8b466"} Mar 18 16:56:03 crc kubenswrapper[4792]: I0318 16:56:03.286156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerStarted","Data":"01a178235b7859b38bd5e42a2f46329bc022ef1452e83495be506b74ae25b76a"} Mar 18 16:56:03 crc kubenswrapper[4792]: I0318 16:56:03.296107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-zn66p" event={"ID":"23ab00b2-385f-4a41-bf3b-3a85c8121d3f","Type":"ContainerStarted","Data":"44571765a8ca91fe180f613894edf668d60e82eace442da07ee3e2a4e571d9c9"} Mar 18 16:56:04 crc kubenswrapper[4792]: I0318 16:56:04.312616 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerStarted","Data":"ff1127b43be9b28bc9523b5fffde98a8baaa065d58f2bb2da87d9ecb54ce000d"} Mar 18 16:56:05 crc 
kubenswrapper[4792]: I0318 16:56:05.123856 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:56:05 crc kubenswrapper[4792]: I0318 16:56:05.124170 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:56:05 crc kubenswrapper[4792]: I0318 16:56:05.326550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-zn66p" event={"ID":"23ab00b2-385f-4a41-bf3b-3a85c8121d3f","Type":"ContainerStarted","Data":"50850798bf3f86e8e3538901dfc510052e7affdfd5bbe72e9b23af0ad1991130"} Mar 18 16:56:05 crc kubenswrapper[4792]: I0318 16:56:05.360916 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564216-zn66p" podStartSLOduration=4.494134901 podStartE2EDuration="5.35950738s" podCreationTimestamp="2026-03-18 16:56:00 +0000 UTC" firstStartedPulling="2026-03-18 16:56:02.41538826 +0000 UTC m=+4911.284717197" lastFinishedPulling="2026-03-18 16:56:03.280760739 +0000 UTC m=+4912.150089676" observedRunningTime="2026-03-18 16:56:05.349508631 +0000 UTC m=+4914.218837568" watchObservedRunningTime="2026-03-18 16:56:05.35950738 +0000 UTC m=+4914.228836317" Mar 18 16:56:06 crc kubenswrapper[4792]: I0318 16:56:06.211984 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:06 crc kubenswrapper[4792]: > Mar 18 16:56:07 crc kubenswrapper[4792]: I0318 16:56:07.360355 4792 generic.go:334] "Generic (PLEG): container finished" podID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerID="ff1127b43be9b28bc9523b5fffde98a8baaa065d58f2bb2da87d9ecb54ce000d" exitCode=0 Mar 18 16:56:07 
crc kubenswrapper[4792]: I0318 16:56:07.361051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerDied","Data":"ff1127b43be9b28bc9523b5fffde98a8baaa065d58f2bb2da87d9ecb54ce000d"} Mar 18 16:56:07 crc kubenswrapper[4792]: I0318 16:56:07.370433 4792 generic.go:334] "Generic (PLEG): container finished" podID="23ab00b2-385f-4a41-bf3b-3a85c8121d3f" containerID="50850798bf3f86e8e3538901dfc510052e7affdfd5bbe72e9b23af0ad1991130" exitCode=0 Mar 18 16:56:07 crc kubenswrapper[4792]: I0318 16:56:07.370486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-zn66p" event={"ID":"23ab00b2-385f-4a41-bf3b-3a85c8121d3f","Type":"ContainerDied","Data":"50850798bf3f86e8e3538901dfc510052e7affdfd5bbe72e9b23af0ad1991130"} Mar 18 16:56:08 crc kubenswrapper[4792]: I0318 16:56:08.386428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerStarted","Data":"066c7d5924c5fbb571c9ca0cd5fb16e4b38c8c31ad7b7383cf0df60147f486b4"} Mar 18 16:56:08 crc kubenswrapper[4792]: I0318 16:56:08.415398 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxl8x" podStartSLOduration=3.7521079459999997 podStartE2EDuration="8.415381693s" podCreationTimestamp="2026-03-18 16:56:00 +0000 UTC" firstStartedPulling="2026-03-18 16:56:03.294445885 +0000 UTC m=+4912.163774812" lastFinishedPulling="2026-03-18 16:56:07.957719622 +0000 UTC m=+4916.827048559" observedRunningTime="2026-03-18 16:56:08.410299991 +0000 UTC m=+4917.279628928" watchObservedRunningTime="2026-03-18 16:56:08.415381693 +0000 UTC m=+4917.284710630" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.133135 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.313565 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwnx\" (UniqueName: \"kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx\") pod \"23ab00b2-385f-4a41-bf3b-3a85c8121d3f\" (UID: \"23ab00b2-385f-4a41-bf3b-3a85c8121d3f\") " Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.335178 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx" (OuterVolumeSpecName: "kube-api-access-5dwnx") pod "23ab00b2-385f-4a41-bf3b-3a85c8121d3f" (UID: "23ab00b2-385f-4a41-bf3b-3a85c8121d3f"). InnerVolumeSpecName "kube-api-access-5dwnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.399832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564216-zn66p" event={"ID":"23ab00b2-385f-4a41-bf3b-3a85c8121d3f","Type":"ContainerDied","Data":"44571765a8ca91fe180f613894edf668d60e82eace442da07ee3e2a4e571d9c9"} Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.399939 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564216-zn66p" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.401007 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44571765a8ca91fe180f613894edf668d60e82eace442da07ee3e2a4e571d9c9" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.417077 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwnx\" (UniqueName: \"kubernetes.io/projected/23ab00b2-385f-4a41-bf3b-3a85c8121d3f-kube-api-access-5dwnx\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.531745 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-xg7wz"] Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.545794 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564210-xg7wz"] Mar 18 16:56:09 crc kubenswrapper[4792]: I0318 16:56:09.886412 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aad12d5-33fc-4adf-8c9d-d603876c4557" path="/var/lib/kubelet/pods/1aad12d5-33fc-4adf-8c9d-d603876c4557/volumes" Mar 18 16:56:11 crc kubenswrapper[4792]: I0318 16:56:11.140531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:11 crc kubenswrapper[4792]: I0318 16:56:11.141133 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:12 crc kubenswrapper[4792]: I0318 16:56:12.217942 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sxl8x" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:12 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:12 crc kubenswrapper[4792]: > Mar 18 16:56:16 crc 
kubenswrapper[4792]: I0318 16:56:16.260066 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:16 crc kubenswrapper[4792]: > Mar 18 16:56:22 crc kubenswrapper[4792]: I0318 16:56:22.200413 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sxl8x" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:22 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:22 crc kubenswrapper[4792]: > Mar 18 16:56:26 crc kubenswrapper[4792]: I0318 16:56:26.181539 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:26 crc kubenswrapper[4792]: > Mar 18 16:56:31 crc kubenswrapper[4792]: I0318 16:56:31.439548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:31 crc kubenswrapper[4792]: I0318 16:56:31.508460 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:32 crc kubenswrapper[4792]: I0318 16:56:32.005661 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:32 crc kubenswrapper[4792]: I0318 16:56:32.695855 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxl8x" 
podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" containerID="cri-o://066c7d5924c5fbb571c9ca0cd5fb16e4b38c8c31ad7b7383cf0df60147f486b4" gracePeriod=2 Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.712017 4792 generic.go:334] "Generic (PLEG): container finished" podID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerID="066c7d5924c5fbb571c9ca0cd5fb16e4b38c8c31ad7b7383cf0df60147f486b4" exitCode=0 Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.712128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerDied","Data":"066c7d5924c5fbb571c9ca0cd5fb16e4b38c8c31ad7b7383cf0df60147f486b4"} Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.934539 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.993366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities\") pod \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.993530 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4hff\" (UniqueName: \"kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff\") pod \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\" (UID: \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.993745 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content\") pod \"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\" (UID: 
\"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6\") " Mar 18 16:56:33 crc kubenswrapper[4792]: I0318 16:56:33.996596 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities" (OuterVolumeSpecName: "utilities") pod "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" (UID: "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.025520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff" (OuterVolumeSpecName: "kube-api-access-x4hff") pod "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" (UID: "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6"). InnerVolumeSpecName "kube-api-access-x4hff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.097527 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.097791 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4hff\" (UniqueName: \"kubernetes.io/projected/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-kube-api-access-x4hff\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.128411 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" (UID: "3d6d2cbc-a349-4bdc-918f-bb29c6b50df6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.199985 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.727138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxl8x" event={"ID":"3d6d2cbc-a349-4bdc-918f-bb29c6b50df6","Type":"ContainerDied","Data":"01a178235b7859b38bd5e42a2f46329bc022ef1452e83495be506b74ae25b76a"} Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.727216 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxl8x" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.729214 4792 scope.go:117] "RemoveContainer" containerID="066c7d5924c5fbb571c9ca0cd5fb16e4b38c8c31ad7b7383cf0df60147f486b4" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.776620 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.799774 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxl8x"] Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.823567 4792 scope.go:117] "RemoveContainer" containerID="ff1127b43be9b28bc9523b5fffde98a8baaa065d58f2bb2da87d9ecb54ce000d" Mar 18 16:56:34 crc kubenswrapper[4792]: I0318 16:56:34.889107 4792 scope.go:117] "RemoveContainer" containerID="f42a0affbc5ac27c4f4d5d53851d6a78b918079292472be66ede8c6a9dc8b466" Mar 18 16:56:35 crc kubenswrapper[4792]: I0318 16:56:35.874491 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" path="/var/lib/kubelet/pods/3d6d2cbc-a349-4bdc-918f-bb29c6b50df6/volumes" Mar 18 16:56:36 crc 
kubenswrapper[4792]: I0318 16:56:36.200042 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:36 crc kubenswrapper[4792]: > Mar 18 16:56:46 crc kubenswrapper[4792]: I0318 16:56:46.179901 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:46 crc kubenswrapper[4792]: > Mar 18 16:56:56 crc kubenswrapper[4792]: I0318 16:56:56.180075 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" probeResult="failure" output=< Mar 18 16:56:56 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:56:56 crc kubenswrapper[4792]: > Mar 18 16:56:57 crc kubenswrapper[4792]: I0318 16:56:57.540382 4792 scope.go:117] "RemoveContainer" containerID="2864eda9fc36dd02bbc834e8711adc58c0d759dca48c487361d34d75e6fd4708" Mar 18 16:57:05 crc kubenswrapper[4792]: I0318 16:57:05.193981 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:57:05 crc kubenswrapper[4792]: I0318 16:57:05.247539 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:57:05 crc kubenswrapper[4792]: I0318 16:57:05.493523 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:57:07 crc kubenswrapper[4792]: I0318 
16:57:07.115488 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7g6m5" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" containerID="cri-o://703b2ad4db989bac2dec66061862e328345c917db913de4aaa230dd70132b1da" gracePeriod=2 Mar 18 16:57:08 crc kubenswrapper[4792]: I0318 16:57:08.128372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerDied","Data":"703b2ad4db989bac2dec66061862e328345c917db913de4aaa230dd70132b1da"} Mar 18 16:57:08 crc kubenswrapper[4792]: I0318 16:57:08.129244 4792 scope.go:117] "RemoveContainer" containerID="26e076b2f4d3790d400522bb4eab3dc1b9002260a2b2ba29796761f2ec1ccf65" Mar 18 16:57:08 crc kubenswrapper[4792]: I0318 16:57:08.129855 4792 generic.go:334] "Generic (PLEG): container finished" podID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerID="703b2ad4db989bac2dec66061862e328345c917db913de4aaa230dd70132b1da" exitCode=0 Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.070798 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.105891 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p5pq\" (UniqueName: \"kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq\") pod \"14189167-a619-4f76-8f74-f3cdf2ca44c9\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.109398 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities\") pod \"14189167-a619-4f76-8f74-f3cdf2ca44c9\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.109505 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content\") pod \"14189167-a619-4f76-8f74-f3cdf2ca44c9\" (UID: \"14189167-a619-4f76-8f74-f3cdf2ca44c9\") " Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.111899 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities" (OuterVolumeSpecName: "utilities") pod "14189167-a619-4f76-8f74-f3cdf2ca44c9" (UID: "14189167-a619-4f76-8f74-f3cdf2ca44c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.125815 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq" (OuterVolumeSpecName: "kube-api-access-5p5pq") pod "14189167-a619-4f76-8f74-f3cdf2ca44c9" (UID: "14189167-a619-4f76-8f74-f3cdf2ca44c9"). InnerVolumeSpecName "kube-api-access-5p5pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.149682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g6m5" event={"ID":"14189167-a619-4f76-8f74-f3cdf2ca44c9","Type":"ContainerDied","Data":"52468e3a7aba70fb0f97497d636a87f28801aea629a1e346b65a468c096fcbb7"} Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.149735 4792 scope.go:117] "RemoveContainer" containerID="703b2ad4db989bac2dec66061862e328345c917db913de4aaa230dd70132b1da" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.149900 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7g6m5" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.207661 4792 scope.go:117] "RemoveContainer" containerID="6e6a7e0b00e09b384ab6e0aca0e568fcb2a04029b64e2eb5769206d8684d65a4" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.213754 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.213789 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p5pq\" (UniqueName: \"kubernetes.io/projected/14189167-a619-4f76-8f74-f3cdf2ca44c9-kube-api-access-5p5pq\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.234244 4792 scope.go:117] "RemoveContainer" containerID="69138f83fa99064086ef1724815f82fe445e5654c11da06d5a42704f396e8363" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.313039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14189167-a619-4f76-8f74-f3cdf2ca44c9" (UID: "14189167-a619-4f76-8f74-f3cdf2ca44c9"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.321536 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14189167-a619-4f76-8f74-f3cdf2ca44c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.509171 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.521944 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7g6m5"] Mar 18 16:57:09 crc kubenswrapper[4792]: I0318 16:57:09.870913 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" path="/var/lib/kubelet/pods/14189167-a619-4f76-8f74-f3cdf2ca44c9/volumes" Mar 18 16:57:30 crc kubenswrapper[4792]: I0318 16:57:30.322184 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:57:30 crc kubenswrapper[4792]: I0318 16:57:30.322811 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.271155 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564218-8dbhb"] Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281117 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="extract-content" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281175 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="extract-content" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281230 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ab00b2-385f-4a41-bf3b-3a85c8121d3f" containerName="oc" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281241 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ab00b2-385f-4a41-bf3b-3a85c8121d3f" containerName="oc" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281262 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281271 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281305 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281313 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281352 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281361 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="extract-utilities" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281370 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="extract-content" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281378 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="extract-content" Mar 18 16:58:00 crc kubenswrapper[4792]: E0318 16:58:00.281404 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.281412 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.284225 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6d2cbc-a349-4bdc-918f-bb29c6b50df6" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.284283 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.284350 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ab00b2-385f-4a41-bf3b-3a85c8121d3f" containerName="oc" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.291810 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.303107 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.303689 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.304134 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.322213 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.322264 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.322571 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-8dbhb"] Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.367346 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mrw\" (UniqueName: \"kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw\") pod \"auto-csr-approver-29564218-8dbhb\" (UID: \"705b0029-d670-4d05-a5bb-bf897b84882c\") " pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:00 crc 
kubenswrapper[4792]: I0318 16:58:00.470610 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74mrw\" (UniqueName: \"kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw\") pod \"auto-csr-approver-29564218-8dbhb\" (UID: \"705b0029-d670-4d05-a5bb-bf897b84882c\") " pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.497346 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mrw\" (UniqueName: \"kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw\") pod \"auto-csr-approver-29564218-8dbhb\" (UID: \"705b0029-d670-4d05-a5bb-bf897b84882c\") " pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:00 crc kubenswrapper[4792]: I0318 16:58:00.629394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:01 crc kubenswrapper[4792]: I0318 16:58:01.233339 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-8dbhb"] Mar 18 16:58:01 crc kubenswrapper[4792]: I0318 16:58:01.985051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" event={"ID":"705b0029-d670-4d05-a5bb-bf897b84882c","Type":"ContainerStarted","Data":"843d533d45a6f86b9156e8c37587ebdf79b238dba00362709820651286b289e0"} Mar 18 16:58:04 crc kubenswrapper[4792]: I0318 16:58:04.017895 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" event={"ID":"705b0029-d670-4d05-a5bb-bf897b84882c","Type":"ContainerStarted","Data":"b2b59e9f065ce04ec23aed25f0f72dfe105cc737f741ac62679172c716d95175"} Mar 18 16:58:04 crc kubenswrapper[4792]: I0318 16:58:04.046047 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29564218-8dbhb" podStartSLOduration=2.869063465 podStartE2EDuration="4.04520019s" podCreationTimestamp="2026-03-18 16:58:00 +0000 UTC" firstStartedPulling="2026-03-18 16:58:01.258334144 +0000 UTC m=+5030.127663091" lastFinishedPulling="2026-03-18 16:58:02.434470879 +0000 UTC m=+5031.303799816" observedRunningTime="2026-03-18 16:58:04.043191446 +0000 UTC m=+5032.912520393" watchObservedRunningTime="2026-03-18 16:58:04.04520019 +0000 UTC m=+5032.914529127" Mar 18 16:58:05 crc kubenswrapper[4792]: I0318 16:58:05.032493 4792 generic.go:334] "Generic (PLEG): container finished" podID="705b0029-d670-4d05-a5bb-bf897b84882c" containerID="b2b59e9f065ce04ec23aed25f0f72dfe105cc737f741ac62679172c716d95175" exitCode=0 Mar 18 16:58:05 crc kubenswrapper[4792]: I0318 16:58:05.032808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" event={"ID":"705b0029-d670-4d05-a5bb-bf897b84882c","Type":"ContainerDied","Data":"b2b59e9f065ce04ec23aed25f0f72dfe105cc737f741ac62679172c716d95175"} Mar 18 16:58:06 crc kubenswrapper[4792]: I0318 16:58:06.494212 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:06 crc kubenswrapper[4792]: I0318 16:58:06.624660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74mrw\" (UniqueName: \"kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw\") pod \"705b0029-d670-4d05-a5bb-bf897b84882c\" (UID: \"705b0029-d670-4d05-a5bb-bf897b84882c\") " Mar 18 16:58:06 crc kubenswrapper[4792]: I0318 16:58:06.630930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw" (OuterVolumeSpecName: "kube-api-access-74mrw") pod "705b0029-d670-4d05-a5bb-bf897b84882c" (UID: "705b0029-d670-4d05-a5bb-bf897b84882c"). InnerVolumeSpecName "kube-api-access-74mrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:58:06 crc kubenswrapper[4792]: I0318 16:58:06.728413 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74mrw\" (UniqueName: \"kubernetes.io/projected/705b0029-d670-4d05-a5bb-bf897b84882c-kube-api-access-74mrw\") on node \"crc\" DevicePath \"\"" Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.059487 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" event={"ID":"705b0029-d670-4d05-a5bb-bf897b84882c","Type":"ContainerDied","Data":"843d533d45a6f86b9156e8c37587ebdf79b238dba00362709820651286b289e0"} Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.059816 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="843d533d45a6f86b9156e8c37587ebdf79b238dba00362709820651286b289e0" Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.059580 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564218-8dbhb" Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.132089 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-c99b6"] Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.143657 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564212-c99b6"] Mar 18 16:58:07 crc kubenswrapper[4792]: I0318 16:58:07.868916 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b4719d-5356-4698-bc46-11e4df4fe32a" path="/var/lib/kubelet/pods/64b4719d-5356-4698-bc46-11e4df4fe32a/volumes" Mar 18 16:58:30 crc kubenswrapper[4792]: I0318 16:58:30.322499 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 16:58:30 crc kubenswrapper[4792]: I0318 16:58:30.323028 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 16:58:30 crc kubenswrapper[4792]: I0318 16:58:30.323083 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 16:58:30 crc kubenswrapper[4792]: I0318 16:58:30.324056 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 16:58:30 crc kubenswrapper[4792]: I0318 16:58:30.324122 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" gracePeriod=600 Mar 18 16:58:30 crc kubenswrapper[4792]: E0318 16:58:30.446290 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:58:31 crc kubenswrapper[4792]: I0318 16:58:31.331731 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" exitCode=0 Mar 18 16:58:31 crc kubenswrapper[4792]: I0318 16:58:31.331806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd"} Mar 18 16:58:31 crc kubenswrapper[4792]: I0318 16:58:31.333158 4792 scope.go:117] "RemoveContainer" containerID="a6f1c178a409c5f0355d650182975948a9c12e1883a5050caf7fccad5d287466" Mar 18 16:58:31 crc kubenswrapper[4792]: I0318 16:58:31.333990 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:58:31 crc kubenswrapper[4792]: E0318 16:58:31.334312 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:58:42 crc kubenswrapper[4792]: I0318 16:58:42.854473 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:58:42 crc kubenswrapper[4792]: E0318 16:58:42.855204 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:58:55 crc kubenswrapper[4792]: I0318 16:58:55.854553 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:58:55 crc kubenswrapper[4792]: E0318 16:58:55.855445 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:58:58 crc kubenswrapper[4792]: I0318 16:58:58.031480 4792 scope.go:117] "RemoveContainer" containerID="64de97ad6257cea44b2f346c80ffd4de370764dadccc41936ad070b27ce98dda" Mar 18 16:59:09 crc kubenswrapper[4792]: I0318 
16:59:09.854525 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:59:09 crc kubenswrapper[4792]: E0318 16:59:09.855462 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.416202 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:12 crc kubenswrapper[4792]: E0318 16:59:12.420004 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.420035 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:59:12 crc kubenswrapper[4792]: E0318 16:59:12.420107 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705b0029-d670-4d05-a5bb-bf897b84882c" containerName="oc" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.420114 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="705b0029-d670-4d05-a5bb-bf897b84882c" containerName="oc" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.420684 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="705b0029-d670-4d05-a5bb-bf897b84882c" containerName="oc" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.420709 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="14189167-a619-4f76-8f74-f3cdf2ca44c9" containerName="registry-server" Mar 18 16:59:12 crc kubenswrapper[4792]: 
I0318 16:59:12.429975 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.455471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.539922 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.539977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.540159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4pw\" (UniqueName: \"kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.641906 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: 
I0318 16:59:12.641954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.642044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4pw\" (UniqueName: \"kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.642596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.643363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.665836 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4pw\" (UniqueName: \"kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw\") pod \"certified-operators-pzvsm\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:12 crc kubenswrapper[4792]: I0318 16:59:12.773960 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:13 crc kubenswrapper[4792]: I0318 16:59:13.314320 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:13 crc kubenswrapper[4792]: I0318 16:59:13.797639 4792 generic.go:334] "Generic (PLEG): container finished" podID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerID="2a45937c074f85d0364e6392f76fcb7f328877ee803cff70e4e280b87377f3b4" exitCode=0 Mar 18 16:59:13 crc kubenswrapper[4792]: I0318 16:59:13.797705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerDied","Data":"2a45937c074f85d0364e6392f76fcb7f328877ee803cff70e4e280b87377f3b4"} Mar 18 16:59:13 crc kubenswrapper[4792]: I0318 16:59:13.797942 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerStarted","Data":"1158c8921926f1560a5978920fff5df6834351a0ae8d16e7b49de6aef37fb288"} Mar 18 16:59:13 crc kubenswrapper[4792]: I0318 16:59:13.799885 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:59:14 crc kubenswrapper[4792]: I0318 16:59:14.810302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerStarted","Data":"4fb4413e7b5d6d785ad41bab411759a934f4c844903154e29b26cb487adc8ae0"} Mar 18 16:59:16 crc kubenswrapper[4792]: I0318 16:59:16.837132 4792 generic.go:334] "Generic (PLEG): container finished" podID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerID="4fb4413e7b5d6d785ad41bab411759a934f4c844903154e29b26cb487adc8ae0" exitCode=0 Mar 18 16:59:16 crc kubenswrapper[4792]: I0318 
16:59:16.837630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerDied","Data":"4fb4413e7b5d6d785ad41bab411759a934f4c844903154e29b26cb487adc8ae0"} Mar 18 16:59:18 crc kubenswrapper[4792]: I0318 16:59:18.874282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerStarted","Data":"754760a40ef45245946897197bb8573fca959bd6373573780388c55532638453"} Mar 18 16:59:18 crc kubenswrapper[4792]: I0318 16:59:18.911341 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pzvsm" podStartSLOduration=3.294583894 podStartE2EDuration="6.911319387s" podCreationTimestamp="2026-03-18 16:59:12 +0000 UTC" firstStartedPulling="2026-03-18 16:59:13.799611753 +0000 UTC m=+5102.668940690" lastFinishedPulling="2026-03-18 16:59:17.416347246 +0000 UTC m=+5106.285676183" observedRunningTime="2026-03-18 16:59:18.898310773 +0000 UTC m=+5107.767639710" watchObservedRunningTime="2026-03-18 16:59:18.911319387 +0000 UTC m=+5107.780648324" Mar 18 16:59:22 crc kubenswrapper[4792]: I0318 16:59:22.774152 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:22 crc kubenswrapper[4792]: I0318 16:59:22.775824 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:22 crc kubenswrapper[4792]: I0318 16:59:22.854813 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:59:22 crc kubenswrapper[4792]: E0318 16:59:22.855193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:59:23 crc kubenswrapper[4792]: I0318 16:59:23.858304 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pzvsm" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" probeResult="failure" output=< Mar 18 16:59:23 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:59:23 crc kubenswrapper[4792]: > Mar 18 16:59:33 crc kubenswrapper[4792]: I0318 16:59:33.835471 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pzvsm" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" probeResult="failure" output=< Mar 18 16:59:33 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 16:59:33 crc kubenswrapper[4792]: > Mar 18 16:59:37 crc kubenswrapper[4792]: I0318 16:59:37.854725 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:59:37 crc kubenswrapper[4792]: E0318 16:59:37.855553 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 16:59:42 crc kubenswrapper[4792]: I0318 16:59:42.979736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 
16:59:43 crc kubenswrapper[4792]: I0318 16:59:43.041700 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:45 crc kubenswrapper[4792]: I0318 16:59:45.915676 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:45 crc kubenswrapper[4792]: I0318 16:59:45.916388 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pzvsm" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" containerID="cri-o://754760a40ef45245946897197bb8573fca959bd6373573780388c55532638453" gracePeriod=2 Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.241067 4792 generic.go:334] "Generic (PLEG): container finished" podID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerID="754760a40ef45245946897197bb8573fca959bd6373573780388c55532638453" exitCode=0 Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.241431 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerDied","Data":"754760a40ef45245946897197bb8573fca959bd6373573780388c55532638453"} Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.540688 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.586006 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content\") pod \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.586252 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn4pw\" (UniqueName: \"kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw\") pod \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.586420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities\") pod \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\" (UID: \"e4edabee-6e00-40ea-a65e-a0cf2e0b3586\") " Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.586918 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities" (OuterVolumeSpecName: "utilities") pod "e4edabee-6e00-40ea-a65e-a0cf2e0b3586" (UID: "e4edabee-6e00-40ea-a65e-a0cf2e0b3586"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.588102 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.595361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw" (OuterVolumeSpecName: "kube-api-access-gn4pw") pod "e4edabee-6e00-40ea-a65e-a0cf2e0b3586" (UID: "e4edabee-6e00-40ea-a65e-a0cf2e0b3586"). InnerVolumeSpecName "kube-api-access-gn4pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.654540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4edabee-6e00-40ea-a65e-a0cf2e0b3586" (UID: "e4edabee-6e00-40ea-a65e-a0cf2e0b3586"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.690807 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:46 crc kubenswrapper[4792]: I0318 16:59:46.691215 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn4pw\" (UniqueName: \"kubernetes.io/projected/e4edabee-6e00-40ea-a65e-a0cf2e0b3586-kube-api-access-gn4pw\") on node \"crc\" DevicePath \"\"" Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.267145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzvsm" event={"ID":"e4edabee-6e00-40ea-a65e-a0cf2e0b3586","Type":"ContainerDied","Data":"1158c8921926f1560a5978920fff5df6834351a0ae8d16e7b49de6aef37fb288"} Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.267198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzvsm" Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.268094 4792 scope.go:117] "RemoveContainer" containerID="754760a40ef45245946897197bb8573fca959bd6373573780388c55532638453" Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.307646 4792 scope.go:117] "RemoveContainer" containerID="4fb4413e7b5d6d785ad41bab411759a934f4c844903154e29b26cb487adc8ae0" Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.311926 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.324070 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pzvsm"] Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.338917 4792 scope.go:117] "RemoveContainer" containerID="2a45937c074f85d0364e6392f76fcb7f328877ee803cff70e4e280b87377f3b4" Mar 18 16:59:47 crc kubenswrapper[4792]: I0318 16:59:47.870710 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" path="/var/lib/kubelet/pods/e4edabee-6e00-40ea-a65e-a0cf2e0b3586/volumes" Mar 18 16:59:48 crc kubenswrapper[4792]: I0318 16:59:48.855223 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 16:59:48 crc kubenswrapper[4792]: E0318 16:59:48.855580 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.163863 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564220-gwnn4"] Mar 18 17:00:00 crc kubenswrapper[4792]: E0318 17:00:00.165126 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="extract-content" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.165144 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="extract-content" Mar 18 17:00:00 crc kubenswrapper[4792]: E0318 17:00:00.165171 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="extract-utilities" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.165181 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="extract-utilities" Mar 18 17:00:00 crc kubenswrapper[4792]: E0318 17:00:00.165207 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.165213 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.165482 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4edabee-6e00-40ea-a65e-a0cf2e0b3586" containerName="registry-server" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.166436 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.168883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.168910 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.169102 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.183432 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-gwnn4"] Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.257823 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf"] Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.260322 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.273570 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.273756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.299781 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf"] Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.342445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szwwf\" (UniqueName: \"kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf\") pod \"auto-csr-approver-29564220-gwnn4\" (UID: \"98f23771-7d72-4164-936b-e1b35d09bda4\") " pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.445269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.445333 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zxwc\" (UniqueName: \"kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.445493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.445646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szwwf\" (UniqueName: \"kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf\") pod \"auto-csr-approver-29564220-gwnn4\" (UID: \"98f23771-7d72-4164-936b-e1b35d09bda4\") " pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.467550 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szwwf\" (UniqueName: \"kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf\") pod \"auto-csr-approver-29564220-gwnn4\" (UID: \"98f23771-7d72-4164-936b-e1b35d09bda4\") " pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.490886 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.547890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.548580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.548635 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zxwc\" (UniqueName: \"kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.550123 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.553607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.570087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zxwc\" (UniqueName: \"kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc\") pod \"collect-profiles-29564220-xrxpf\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:00 crc kubenswrapper[4792]: I0318 17:00:00.589198 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.080559 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-gwnn4"] Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.221609 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf"] Mar 18 17:00:01 crc kubenswrapper[4792]: W0318 17:00:01.224800 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b9c0d99_d9cd_48b9_a400_69d88693e95f.slice/crio-fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42 WatchSource:0}: Error finding container fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42: Status 404 returned error can't find the container with id fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42 Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.424547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" 
event={"ID":"98f23771-7d72-4164-936b-e1b35d09bda4","Type":"ContainerStarted","Data":"470d245b16b298e797ba49d396f8f774842b8be4c97e30760238aa050699665c"} Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.427563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" event={"ID":"3b9c0d99-d9cd-48b9-a400-69d88693e95f","Type":"ContainerStarted","Data":"b7062e812cb9e2b835fb22302a60d4e312d18ce4d8cbec8e2073dc17a213553d"} Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.427626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" event={"ID":"3b9c0d99-d9cd-48b9-a400-69d88693e95f","Type":"ContainerStarted","Data":"fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42"} Mar 18 17:00:01 crc kubenswrapper[4792]: I0318 17:00:01.448817 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" podStartSLOduration=1.448797782 podStartE2EDuration="1.448797782s" podCreationTimestamp="2026-03-18 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:00:01.445035422 +0000 UTC m=+5150.314364379" watchObservedRunningTime="2026-03-18 17:00:01.448797782 +0000 UTC m=+5150.318126719" Mar 18 17:00:02 crc kubenswrapper[4792]: I0318 17:00:02.443061 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b9c0d99-d9cd-48b9-a400-69d88693e95f" containerID="b7062e812cb9e2b835fb22302a60d4e312d18ce4d8cbec8e2073dc17a213553d" exitCode=0 Mar 18 17:00:02 crc kubenswrapper[4792]: I0318 17:00:02.443318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" 
event={"ID":"3b9c0d99-d9cd-48b9-a400-69d88693e95f","Type":"ContainerDied","Data":"b7062e812cb9e2b835fb22302a60d4e312d18ce4d8cbec8e2073dc17a213553d"} Mar 18 17:00:03 crc kubenswrapper[4792]: I0318 17:00:03.856228 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:00:03 crc kubenswrapper[4792]: E0318 17:00:03.856926 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.383066 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.455803 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume\") pod \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.456185 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume\") pod \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.456238 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zxwc\" (UniqueName: 
\"kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc\") pod \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\" (UID: \"3b9c0d99-d9cd-48b9-a400-69d88693e95f\") " Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.457101 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3b9c0d99-d9cd-48b9-a400-69d88693e95f" (UID: "3b9c0d99-d9cd-48b9-a400-69d88693e95f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.459847 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b9c0d99-d9cd-48b9-a400-69d88693e95f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.471815 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc" (OuterVolumeSpecName: "kube-api-access-9zxwc") pod "3b9c0d99-d9cd-48b9-a400-69d88693e95f" (UID: "3b9c0d99-d9cd-48b9-a400-69d88693e95f"). InnerVolumeSpecName "kube-api-access-9zxwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.478002 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3b9c0d99-d9cd-48b9-a400-69d88693e95f" (UID: "3b9c0d99-d9cd-48b9-a400-69d88693e95f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.489898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" event={"ID":"3b9c0d99-d9cd-48b9-a400-69d88693e95f","Type":"ContainerDied","Data":"fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42"} Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.490212 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc30da0ab75b97b55e8ceb3119b463415a2923eba775161d1c4d56daa8e7c42" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.490053 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564220-xrxpf" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.553895 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"] Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.562019 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zxwc\" (UniqueName: \"kubernetes.io/projected/3b9c0d99-d9cd-48b9-a400-69d88693e95f-kube-api-access-9zxwc\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.562255 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b9c0d99-d9cd-48b9-a400-69d88693e95f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:04 crc kubenswrapper[4792]: I0318 17:00:04.566097 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564175-2w9pv"] Mar 18 17:00:05 crc kubenswrapper[4792]: I0318 17:00:05.501321 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" 
event={"ID":"98f23771-7d72-4164-936b-e1b35d09bda4","Type":"ContainerStarted","Data":"b635e8276e1a5a1637bf307d03106e8e8a81579a01993f1d69b692c083448d3f"} Mar 18 17:00:05 crc kubenswrapper[4792]: I0318 17:00:05.523529 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" podStartSLOduration=3.351092229 podStartE2EDuration="5.523511116s" podCreationTimestamp="2026-03-18 17:00:00 +0000 UTC" firstStartedPulling="2026-03-18 17:00:01.085959165 +0000 UTC m=+5149.955288102" lastFinishedPulling="2026-03-18 17:00:03.258378052 +0000 UTC m=+5152.127706989" observedRunningTime="2026-03-18 17:00:05.5154777 +0000 UTC m=+5154.384806657" watchObservedRunningTime="2026-03-18 17:00:05.523511116 +0000 UTC m=+5154.392840053" Mar 18 17:00:05 crc kubenswrapper[4792]: I0318 17:00:05.871477 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa5c39e-0993-45e7-9a44-e392347f3c05" path="/var/lib/kubelet/pods/1fa5c39e-0993-45e7-9a44-e392347f3c05/volumes" Mar 18 17:00:06 crc kubenswrapper[4792]: I0318 17:00:06.514553 4792 generic.go:334] "Generic (PLEG): container finished" podID="98f23771-7d72-4164-936b-e1b35d09bda4" containerID="b635e8276e1a5a1637bf307d03106e8e8a81579a01993f1d69b692c083448d3f" exitCode=0 Mar 18 17:00:06 crc kubenswrapper[4792]: I0318 17:00:06.514637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" event={"ID":"98f23771-7d72-4164-936b-e1b35d09bda4","Type":"ContainerDied","Data":"b635e8276e1a5a1637bf307d03106e8e8a81579a01993f1d69b692c083448d3f"} Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.010606 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.053376 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szwwf\" (UniqueName: \"kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf\") pod \"98f23771-7d72-4164-936b-e1b35d09bda4\" (UID: \"98f23771-7d72-4164-936b-e1b35d09bda4\") " Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.060474 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf" (OuterVolumeSpecName: "kube-api-access-szwwf") pod "98f23771-7d72-4164-936b-e1b35d09bda4" (UID: "98f23771-7d72-4164-936b-e1b35d09bda4"). InnerVolumeSpecName "kube-api-access-szwwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.157059 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szwwf\" (UniqueName: \"kubernetes.io/projected/98f23771-7d72-4164-936b-e1b35d09bda4-kube-api-access-szwwf\") on node \"crc\" DevicePath \"\"" Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.538658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" event={"ID":"98f23771-7d72-4164-936b-e1b35d09bda4","Type":"ContainerDied","Data":"470d245b16b298e797ba49d396f8f774842b8be4c97e30760238aa050699665c"} Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.538713 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="470d245b16b298e797ba49d396f8f774842b8be4c97e30760238aa050699665c" Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.538731 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564220-gwnn4" Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.590865 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-nbg97"] Mar 18 17:00:08 crc kubenswrapper[4792]: I0318 17:00:08.603370 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564214-nbg97"] Mar 18 17:00:09 crc kubenswrapper[4792]: I0318 17:00:09.867819 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a2888c-51a4-4f58-8bab-671f62c3c29f" path="/var/lib/kubelet/pods/61a2888c-51a4-4f58-8bab-671f62c3c29f/volumes" Mar 18 17:00:18 crc kubenswrapper[4792]: I0318 17:00:18.854832 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:00:18 crc kubenswrapper[4792]: E0318 17:00:18.855806 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:00:31 crc kubenswrapper[4792]: I0318 17:00:31.864146 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:00:31 crc kubenswrapper[4792]: E0318 17:00:31.865039 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:00:43 crc kubenswrapper[4792]: I0318 17:00:43.854450 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:00:43 crc kubenswrapper[4792]: E0318 17:00:43.856480 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:00:58 crc kubenswrapper[4792]: I0318 17:00:58.183570 4792 scope.go:117] "RemoveContainer" containerID="dbfbe879da63272be5dfd0faad37ea9c110e9f8cd4dc8da4b1f7b620cac5e1bb" Mar 18 17:00:58 crc kubenswrapper[4792]: I0318 17:00:58.249528 4792 scope.go:117] "RemoveContainer" containerID="7f1ecc138b3c6b07033d7f5569f04e187ddebd4069346601b77f8a9aee767dd2" Mar 18 17:00:58 crc kubenswrapper[4792]: I0318 17:00:58.855041 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:00:58 crc kubenswrapper[4792]: E0318 17:00:58.855493 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.164212 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564221-pvzkd"] Mar 18 17:01:00 crc kubenswrapper[4792]: E0318 17:01:00.165189 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="98f23771-7d72-4164-936b-e1b35d09bda4" containerName="oc" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.165210 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f23771-7d72-4164-936b-e1b35d09bda4" containerName="oc" Mar 18 17:01:00 crc kubenswrapper[4792]: E0318 17:01:00.165259 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9c0d99-d9cd-48b9-a400-69d88693e95f" containerName="collect-profiles" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.165270 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9c0d99-d9cd-48b9-a400-69d88693e95f" containerName="collect-profiles" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.165569 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9c0d99-d9cd-48b9-a400-69d88693e95f" containerName="collect-profiles" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.165596 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f23771-7d72-4164-936b-e1b35d09bda4" containerName="oc" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.166685 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.186883 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564221-pvzkd"] Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.265885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfzn\" (UniqueName: \"kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.266174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.266309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.266328 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.368790 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ljfzn\" (UniqueName: \"kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.368853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.369043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.369064 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.378902 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.379258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.386150 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.386859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfzn\" (UniqueName: \"kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn\") pod \"keystone-cron-29564221-pvzkd\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:00 crc kubenswrapper[4792]: I0318 17:01:00.497627 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:01 crc kubenswrapper[4792]: W0318 17:01:01.048234 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3213a9_53e0_4373_b606_2e7166eb8e26.slice/crio-b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49 WatchSource:0}: Error finding container b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49: Status 404 returned error can't find the container with id b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49 Mar 18 17:01:01 crc kubenswrapper[4792]: I0318 17:01:01.048822 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564221-pvzkd"] Mar 18 17:01:01 crc kubenswrapper[4792]: I0318 17:01:01.114231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564221-pvzkd" event={"ID":"cd3213a9-53e0-4373-b606-2e7166eb8e26","Type":"ContainerStarted","Data":"b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49"} Mar 18 17:01:02 crc kubenswrapper[4792]: I0318 17:01:02.128479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564221-pvzkd" event={"ID":"cd3213a9-53e0-4373-b606-2e7166eb8e26","Type":"ContainerStarted","Data":"f506a21dd5e5e73b15ddc7c46535b5a3510e169ad7ec86b3d8aba7e955ad815b"} Mar 18 17:01:02 crc kubenswrapper[4792]: I0318 17:01:02.150047 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564221-pvzkd" podStartSLOduration=2.150030557 podStartE2EDuration="2.150030557s" podCreationTimestamp="2026-03-18 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:01:02.144056458 +0000 UTC m=+5211.013385405" watchObservedRunningTime="2026-03-18 17:01:02.150030557 +0000 UTC m=+5211.019359494" Mar 18 17:01:05 crc 
kubenswrapper[4792]: I0318 17:01:05.171067 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd3213a9-53e0-4373-b606-2e7166eb8e26" containerID="f506a21dd5e5e73b15ddc7c46535b5a3510e169ad7ec86b3d8aba7e955ad815b" exitCode=0 Mar 18 17:01:05 crc kubenswrapper[4792]: I0318 17:01:05.171170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564221-pvzkd" event={"ID":"cd3213a9-53e0-4373-b606-2e7166eb8e26","Type":"ContainerDied","Data":"f506a21dd5e5e73b15ddc7c46535b5a3510e169ad7ec86b3d8aba7e955ad815b"} Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.668652 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.749130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys\") pod \"cd3213a9-53e0-4373-b606-2e7166eb8e26\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.749423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfzn\" (UniqueName: \"kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn\") pod \"cd3213a9-53e0-4373-b606-2e7166eb8e26\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.749482 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle\") pod \"cd3213a9-53e0-4373-b606-2e7166eb8e26\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.749649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data\") pod \"cd3213a9-53e0-4373-b606-2e7166eb8e26\" (UID: \"cd3213a9-53e0-4373-b606-2e7166eb8e26\") " Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.831877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cd3213a9-53e0-4373-b606-2e7166eb8e26" (UID: "cd3213a9-53e0-4373-b606-2e7166eb8e26"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.832049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn" (OuterVolumeSpecName: "kube-api-access-ljfzn") pod "cd3213a9-53e0-4373-b606-2e7166eb8e26" (UID: "cd3213a9-53e0-4373-b606-2e7166eb8e26"). InnerVolumeSpecName "kube-api-access-ljfzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.836991 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd3213a9-53e0-4373-b606-2e7166eb8e26" (UID: "cd3213a9-53e0-4373-b606-2e7166eb8e26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.853222 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfzn\" (UniqueName: \"kubernetes.io/projected/cd3213a9-53e0-4373-b606-2e7166eb8e26-kube-api-access-ljfzn\") on node \"crc\" DevicePath \"\"" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.853252 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.853262 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.868682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data" (OuterVolumeSpecName: "config-data") pod "cd3213a9-53e0-4373-b606-2e7166eb8e26" (UID: "cd3213a9-53e0-4373-b606-2e7166eb8e26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 17:01:06 crc kubenswrapper[4792]: I0318 17:01:06.955831 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3213a9-53e0-4373-b606-2e7166eb8e26-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 17:01:07 crc kubenswrapper[4792]: I0318 17:01:07.195724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564221-pvzkd" event={"ID":"cd3213a9-53e0-4373-b606-2e7166eb8e26","Type":"ContainerDied","Data":"b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49"} Mar 18 17:01:07 crc kubenswrapper[4792]: I0318 17:01:07.196005 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4eaa97b5f7dfa8f4ee2ac6306d66d1f9b47791a3d424b7322ceccfa0132be49" Mar 18 17:01:07 crc kubenswrapper[4792]: I0318 17:01:07.195791 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564221-pvzkd" Mar 18 17:01:13 crc kubenswrapper[4792]: I0318 17:01:13.855864 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:01:13 crc kubenswrapper[4792]: E0318 17:01:13.856635 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:01:26 crc kubenswrapper[4792]: I0318 17:01:26.854225 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:01:26 crc kubenswrapper[4792]: E0318 17:01:26.855103 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:01:41 crc kubenswrapper[4792]: I0318 17:01:41.869853 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:01:41 crc kubenswrapper[4792]: E0318 17:01:41.871993 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:01:55 crc kubenswrapper[4792]: I0318 17:01:55.855158 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:01:55 crc kubenswrapper[4792]: E0318 17:01:55.856154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.147403 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564222-nshkt"] Mar 18 17:02:00 crc kubenswrapper[4792]: E0318 17:02:00.148450 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cd3213a9-53e0-4373-b606-2e7166eb8e26" containerName="keystone-cron" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.148472 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3213a9-53e0-4373-b606-2e7166eb8e26" containerName="keystone-cron" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.148782 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3213a9-53e0-4373-b606-2e7166eb8e26" containerName="keystone-cron" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.149731 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.152879 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.153176 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.154030 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.160508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-nshkt"] Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.284609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb4xz\" (UniqueName: \"kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz\") pod \"auto-csr-approver-29564222-nshkt\" (UID: \"8deb8186-c72b-4eef-8547-cbebc8cb0985\") " pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.386934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tb4xz\" (UniqueName: \"kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz\") pod \"auto-csr-approver-29564222-nshkt\" (UID: \"8deb8186-c72b-4eef-8547-cbebc8cb0985\") " pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.408622 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb4xz\" (UniqueName: \"kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz\") pod \"auto-csr-approver-29564222-nshkt\" (UID: \"8deb8186-c72b-4eef-8547-cbebc8cb0985\") " pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:00 crc kubenswrapper[4792]: I0318 17:02:00.471361 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:01 crc kubenswrapper[4792]: I0318 17:02:01.021397 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-nshkt"] Mar 18 17:02:02 crc kubenswrapper[4792]: I0318 17:02:02.030316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-nshkt" event={"ID":"8deb8186-c72b-4eef-8547-cbebc8cb0985","Type":"ContainerStarted","Data":"24d9e618b75cae9764bb92b2f3b8bbe496184937c86ae2103d32fd980200ce18"} Mar 18 17:02:03 crc kubenswrapper[4792]: I0318 17:02:03.042544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-nshkt" event={"ID":"8deb8186-c72b-4eef-8547-cbebc8cb0985","Type":"ContainerStarted","Data":"879b3755e387efcba6ad0e2cf70bd9dceffed983ed73cd8d57ab2c32c69d8b92"} Mar 18 17:02:03 crc kubenswrapper[4792]: I0318 17:02:03.062260 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564222-nshkt" podStartSLOduration=1.967443452 podStartE2EDuration="3.062239956s" podCreationTimestamp="2026-03-18 17:02:00 
+0000 UTC" firstStartedPulling="2026-03-18 17:02:01.44140563 +0000 UTC m=+5270.310734567" lastFinishedPulling="2026-03-18 17:02:02.536202134 +0000 UTC m=+5271.405531071" observedRunningTime="2026-03-18 17:02:03.056093702 +0000 UTC m=+5271.925422639" watchObservedRunningTime="2026-03-18 17:02:03.062239956 +0000 UTC m=+5271.931568893" Mar 18 17:02:04 crc kubenswrapper[4792]: I0318 17:02:04.065598 4792 generic.go:334] "Generic (PLEG): container finished" podID="8deb8186-c72b-4eef-8547-cbebc8cb0985" containerID="879b3755e387efcba6ad0e2cf70bd9dceffed983ed73cd8d57ab2c32c69d8b92" exitCode=0 Mar 18 17:02:04 crc kubenswrapper[4792]: I0318 17:02:04.065925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-nshkt" event={"ID":"8deb8186-c72b-4eef-8547-cbebc8cb0985","Type":"ContainerDied","Data":"879b3755e387efcba6ad0e2cf70bd9dceffed983ed73cd8d57ab2c32c69d8b92"} Mar 18 17:02:05 crc kubenswrapper[4792]: I0318 17:02:05.516334 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:05 crc kubenswrapper[4792]: I0318 17:02:05.651094 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb4xz\" (UniqueName: \"kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz\") pod \"8deb8186-c72b-4eef-8547-cbebc8cb0985\" (UID: \"8deb8186-c72b-4eef-8547-cbebc8cb0985\") " Mar 18 17:02:05 crc kubenswrapper[4792]: I0318 17:02:05.656307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz" (OuterVolumeSpecName: "kube-api-access-tb4xz") pod "8deb8186-c72b-4eef-8547-cbebc8cb0985" (UID: "8deb8186-c72b-4eef-8547-cbebc8cb0985"). InnerVolumeSpecName "kube-api-access-tb4xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:02:05 crc kubenswrapper[4792]: I0318 17:02:05.754409 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb4xz\" (UniqueName: \"kubernetes.io/projected/8deb8186-c72b-4eef-8547-cbebc8cb0985-kube-api-access-tb4xz\") on node \"crc\" DevicePath \"\"" Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.095483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564222-nshkt" event={"ID":"8deb8186-c72b-4eef-8547-cbebc8cb0985","Type":"ContainerDied","Data":"24d9e618b75cae9764bb92b2f3b8bbe496184937c86ae2103d32fd980200ce18"} Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.095717 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d9e618b75cae9764bb92b2f3b8bbe496184937c86ae2103d32fd980200ce18" Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.095541 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564222-nshkt" Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.127540 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-zn66p"] Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.138844 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564216-zn66p"] Mar 18 17:02:06 crc kubenswrapper[4792]: I0318 17:02:06.855178 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:02:06 crc kubenswrapper[4792]: E0318 17:02:06.855553 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:02:07 crc kubenswrapper[4792]: I0318 17:02:07.867759 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ab00b2-385f-4a41-bf3b-3a85c8121d3f" path="/var/lib/kubelet/pods/23ab00b2-385f-4a41-bf3b-3a85c8121d3f/volumes" Mar 18 17:02:21 crc kubenswrapper[4792]: I0318 17:02:21.865348 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:02:21 crc kubenswrapper[4792]: E0318 17:02:21.869819 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:02:34 crc kubenswrapper[4792]: I0318 17:02:34.854214 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:02:34 crc kubenswrapper[4792]: E0318 17:02:34.855103 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:02:48 crc kubenswrapper[4792]: I0318 17:02:48.854946 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:02:48 crc kubenswrapper[4792]: E0318 17:02:48.855928 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:02:58 crc kubenswrapper[4792]: I0318 17:02:58.988264 4792 scope.go:117] "RemoveContainer" containerID="50850798bf3f86e8e3538901dfc510052e7affdfd5bbe72e9b23af0ad1991130" Mar 18 17:03:02 crc kubenswrapper[4792]: I0318 17:03:02.854086 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:03:02 crc kubenswrapper[4792]: E0318 17:03:02.855112 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:03:13 crc kubenswrapper[4792]: I0318 17:03:13.855357 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:03:13 crc kubenswrapper[4792]: E0318 17:03:13.856359 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:03:28 crc kubenswrapper[4792]: I0318 17:03:28.855133 4792 scope.go:117] 
"RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:03:28 crc kubenswrapper[4792]: E0318 17:03:28.856545 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:03:39 crc kubenswrapper[4792]: I0318 17:03:39.854084 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:03:40 crc kubenswrapper[4792]: I0318 17:03:40.163496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a"} Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.160280 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564224-jm7nf"] Mar 18 17:04:00 crc kubenswrapper[4792]: E0318 17:04:00.162117 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8deb8186-c72b-4eef-8547-cbebc8cb0985" containerName="oc" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.162142 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8deb8186-c72b-4eef-8547-cbebc8cb0985" containerName="oc" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.162492 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8deb8186-c72b-4eef-8547-cbebc8cb0985" containerName="oc" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.164725 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.168885 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.169180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.169256 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.198187 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-jm7nf"] Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.320103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkjh\" (UniqueName: \"kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh\") pod \"auto-csr-approver-29564224-jm7nf\" (UID: \"2da3fb70-6060-4637-b00a-3591615d46e9\") " pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.430205 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkjh\" (UniqueName: \"kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh\") pod \"auto-csr-approver-29564224-jm7nf\" (UID: \"2da3fb70-6060-4637-b00a-3591615d46e9\") " pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.457008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkjh\" (UniqueName: \"kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh\") pod \"auto-csr-approver-29564224-jm7nf\" (UID: \"2da3fb70-6060-4637-b00a-3591615d46e9\") " 
pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:00 crc kubenswrapper[4792]: I0318 17:04:00.499633 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:01 crc kubenswrapper[4792]: W0318 17:04:00.998642 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2da3fb70_6060_4637_b00a_3591615d46e9.slice/crio-4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e WatchSource:0}: Error finding container 4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e: Status 404 returned error can't find the container with id 4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e Mar 18 17:04:01 crc kubenswrapper[4792]: I0318 17:04:01.000154 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-jm7nf"] Mar 18 17:04:01 crc kubenswrapper[4792]: I0318 17:04:01.434376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" event={"ID":"2da3fb70-6060-4637-b00a-3591615d46e9","Type":"ContainerStarted","Data":"4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e"} Mar 18 17:04:03 crc kubenswrapper[4792]: I0318 17:04:03.463897 4792 generic.go:334] "Generic (PLEG): container finished" podID="2da3fb70-6060-4637-b00a-3591615d46e9" containerID="749b2a8c6fa3bc6e71759415966dc1cc8a1e52930a5bdf2074dc19a90672dff3" exitCode=0 Mar 18 17:04:03 crc kubenswrapper[4792]: I0318 17:04:03.463988 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" event={"ID":"2da3fb70-6060-4637-b00a-3591615d46e9","Type":"ContainerDied","Data":"749b2a8c6fa3bc6e71759415966dc1cc8a1e52930a5bdf2074dc19a90672dff3"} Mar 18 17:04:04 crc kubenswrapper[4792]: I0318 17:04:04.936573 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.047837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkjh\" (UniqueName: \"kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh\") pod \"2da3fb70-6060-4637-b00a-3591615d46e9\" (UID: \"2da3fb70-6060-4637-b00a-3591615d46e9\") " Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.055302 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh" (OuterVolumeSpecName: "kube-api-access-qgkjh") pod "2da3fb70-6060-4637-b00a-3591615d46e9" (UID: "2da3fb70-6060-4637-b00a-3591615d46e9"). InnerVolumeSpecName "kube-api-access-qgkjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.151496 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkjh\" (UniqueName: \"kubernetes.io/projected/2da3fb70-6060-4637-b00a-3591615d46e9-kube-api-access-qgkjh\") on node \"crc\" DevicePath \"\"" Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.487161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" event={"ID":"2da3fb70-6060-4637-b00a-3591615d46e9","Type":"ContainerDied","Data":"4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e"} Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.487397 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d40833ab93da8b330b3f3b26d92a3cc55d018c75b37beba749f58bc2139c04e" Mar 18 17:04:05 crc kubenswrapper[4792]: I0318 17:04:05.487234 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564224-jm7nf" Mar 18 17:04:06 crc kubenswrapper[4792]: I0318 17:04:06.008730 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-8dbhb"] Mar 18 17:04:06 crc kubenswrapper[4792]: I0318 17:04:06.020587 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564218-8dbhb"] Mar 18 17:04:07 crc kubenswrapper[4792]: I0318 17:04:07.866470 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705b0029-d670-4d05-a5bb-bf897b84882c" path="/var/lib/kubelet/pods/705b0029-d670-4d05-a5bb-bf897b84882c/volumes" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.495032 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:04:43 crc kubenswrapper[4792]: E0318 17:04:43.497317 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da3fb70-6060-4637-b00a-3591615d46e9" containerName="oc" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.497405 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da3fb70-6060-4637-b00a-3591615d46e9" containerName="oc" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.497748 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da3fb70-6060-4637-b00a-3591615d46e9" containerName="oc" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.499834 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.509689 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.631134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.631377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl57d\" (UniqueName: \"kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.631414 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.733664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl57d\" (UniqueName: \"kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.733715 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.733791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.734301 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.734330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.756279 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl57d\" (UniqueName: \"kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d\") pod \"redhat-marketplace-gr92q\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:43 crc kubenswrapper[4792]: I0318 17:04:43.828852 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:44 crc kubenswrapper[4792]: I0318 17:04:44.383942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:04:44 crc kubenswrapper[4792]: I0318 17:04:44.920440 4792 generic.go:334] "Generic (PLEG): container finished" podID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerID="2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced" exitCode=0 Mar 18 17:04:44 crc kubenswrapper[4792]: I0318 17:04:44.920504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerDied","Data":"2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced"} Mar 18 17:04:44 crc kubenswrapper[4792]: I0318 17:04:44.920547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerStarted","Data":"7d9b92ab95517f4799c36f50b2948e9c6b383ef90b12ec60617897066aaf69eb"} Mar 18 17:04:44 crc kubenswrapper[4792]: I0318 17:04:44.924067 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:04:45 crc kubenswrapper[4792]: I0318 17:04:45.935718 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerStarted","Data":"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd"} Mar 18 17:04:47 crc kubenswrapper[4792]: I0318 17:04:47.955834 4792 generic.go:334] "Generic (PLEG): container finished" podID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerID="cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd" exitCode=0 Mar 18 17:04:47 crc kubenswrapper[4792]: I0318 17:04:47.956213 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerDied","Data":"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd"} Mar 18 17:04:48 crc kubenswrapper[4792]: I0318 17:04:48.989861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerStarted","Data":"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501"} Mar 18 17:04:49 crc kubenswrapper[4792]: I0318 17:04:49.017450 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gr92q" podStartSLOduration=2.57327407 podStartE2EDuration="6.017429569s" podCreationTimestamp="2026-03-18 17:04:43 +0000 UTC" firstStartedPulling="2026-03-18 17:04:44.923749818 +0000 UTC m=+5433.793078755" lastFinishedPulling="2026-03-18 17:04:48.367905317 +0000 UTC m=+5437.237234254" observedRunningTime="2026-03-18 17:04:49.007468405 +0000 UTC m=+5437.876797342" watchObservedRunningTime="2026-03-18 17:04:49.017429569 +0000 UTC m=+5437.886758506" Mar 18 17:04:50 crc kubenswrapper[4792]: I0318 17:04:50.440528 4792 trace.go:236] Trace[1481575751]: "Calculate volume metrics of storage for pod minio-dev/minio" (18-Mar-2026 17:04:49.412) (total time: 1028ms): Mar 18 17:04:50 crc kubenswrapper[4792]: Trace[1481575751]: [1.028486421s] [1.028486421s] END Mar 18 17:04:53 crc kubenswrapper[4792]: I0318 17:04:53.829401 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:53 crc kubenswrapper[4792]: I0318 17:04:53.830011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:04:54 crc kubenswrapper[4792]: I0318 17:04:54.888695 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-gr92q" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="registry-server" probeResult="failure" output=< Mar 18 17:04:54 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:04:54 crc kubenswrapper[4792]: > Mar 18 17:04:59 crc kubenswrapper[4792]: I0318 17:04:59.232386 4792 scope.go:117] "RemoveContainer" containerID="b2b59e9f065ce04ec23aed25f0f72dfe105cc737f741ac62679172c716d95175" Mar 18 17:05:03 crc kubenswrapper[4792]: I0318 17:05:03.884742 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:05:03 crc kubenswrapper[4792]: I0318 17:05:03.949925 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:05:04 crc kubenswrapper[4792]: I0318 17:05:04.126354 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.165139 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gr92q" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="registry-server" containerID="cri-o://ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501" gracePeriod=2 Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.716060 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.802128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content\") pod \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.802236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities\") pod \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.802489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl57d\" (UniqueName: \"kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d\") pod \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\" (UID: \"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf\") " Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.803080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities" (OuterVolumeSpecName: "utilities") pod "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" (UID: "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.803392 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.815411 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d" (OuterVolumeSpecName: "kube-api-access-zl57d") pod "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" (UID: "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf"). InnerVolumeSpecName "kube-api-access-zl57d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.826579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" (UID: "afb0c49d-56c6-489f-b6ba-dd2fb0f572bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.906161 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl57d\" (UniqueName: \"kubernetes.io/projected/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-kube-api-access-zl57d\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:05 crc kubenswrapper[4792]: I0318 17:05:05.906203 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.179327 4792 generic.go:334] "Generic (PLEG): container finished" podID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerID="ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501" exitCode=0 Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.179379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerDied","Data":"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501"} Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.179415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr92q" event={"ID":"afb0c49d-56c6-489f-b6ba-dd2fb0f572bf","Type":"ContainerDied","Data":"7d9b92ab95517f4799c36f50b2948e9c6b383ef90b12ec60617897066aaf69eb"} Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.179435 4792 scope.go:117] "RemoveContainer" containerID="ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.179492 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr92q" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.209399 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.209503 4792 scope.go:117] "RemoveContainer" containerID="cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.222543 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr92q"] Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.234405 4792 scope.go:117] "RemoveContainer" containerID="2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.292753 4792 scope.go:117] "RemoveContainer" containerID="ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501" Mar 18 17:05:06 crc kubenswrapper[4792]: E0318 17:05:06.295624 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501\": container with ID starting with ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501 not found: ID does not exist" containerID="ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.295666 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501"} err="failed to get container status \"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501\": rpc error: code = NotFound desc = could not find container \"ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501\": container with ID starting with ffbd89c98669c766a68f467aef4ab8ec14863e684b14004daf5db06b4d6b3501 not found: 
ID does not exist" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.295690 4792 scope.go:117] "RemoveContainer" containerID="cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd" Mar 18 17:05:06 crc kubenswrapper[4792]: E0318 17:05:06.296063 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd\": container with ID starting with cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd not found: ID does not exist" containerID="cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.296088 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd"} err="failed to get container status \"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd\": rpc error: code = NotFound desc = could not find container \"cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd\": container with ID starting with cde2ed976eb9553d06dbcc8b8fe6f5c4ca25508f6c9e3eb859e38f324f81f3dd not found: ID does not exist" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.296100 4792 scope.go:117] "RemoveContainer" containerID="2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced" Mar 18 17:05:06 crc kubenswrapper[4792]: E0318 17:05:06.296570 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced\": container with ID starting with 2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced not found: ID does not exist" containerID="2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced" Mar 18 17:05:06 crc kubenswrapper[4792]: I0318 17:05:06.296617 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced"} err="failed to get container status \"2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced\": rpc error: code = NotFound desc = could not find container \"2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced\": container with ID starting with 2e12934199d2c02a0d68f4f4e31f2b5f272e1a0b0dd54a1cb2ade6b9c7613ced not found: ID does not exist" Mar 18 17:05:07 crc kubenswrapper[4792]: I0318 17:05:07.872629 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" path="/var/lib/kubelet/pods/afb0c49d-56c6-489f-b6ba-dd2fb0f572bf/volumes" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.295183 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:05:18 crc kubenswrapper[4792]: E0318 17:05:18.296582 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="extract-content" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.296602 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="extract-content" Mar 18 17:05:18 crc kubenswrapper[4792]: E0318 17:05:18.296650 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="registry-server" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.296658 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="registry-server" Mar 18 17:05:18 crc kubenswrapper[4792]: E0318 17:05:18.296702 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="extract-utilities" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.296713 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="extract-utilities" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.297010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb0c49d-56c6-489f-b6ba-dd2fb0f572bf" containerName="registry-server" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.299391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.310305 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.456634 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbfv\" (UniqueName: \"kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.456709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.456735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.559420 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7lbfv\" (UniqueName: \"kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.559490 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.559527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.560134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.560569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.581453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbfv\" 
(UniqueName: \"kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv\") pod \"redhat-operators-xbb7z\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:18 crc kubenswrapper[4792]: I0318 17:05:18.624764 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:19 crc kubenswrapper[4792]: I0318 17:05:19.113075 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:05:19 crc kubenswrapper[4792]: I0318 17:05:19.327061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerStarted","Data":"1031064652e50a5338f829ca40b06be3b385c3c026167e3340ca4e69823c5b67"} Mar 18 17:05:20 crc kubenswrapper[4792]: I0318 17:05:20.343140 4792 generic.go:334] "Generic (PLEG): container finished" podID="20edee68-6dc9-4204-b243-74f83dcff772" containerID="3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370" exitCode=0 Mar 18 17:05:20 crc kubenswrapper[4792]: I0318 17:05:20.343222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerDied","Data":"3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370"} Mar 18 17:05:22 crc kubenswrapper[4792]: I0318 17:05:22.366649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerStarted","Data":"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392"} Mar 18 17:05:27 crc kubenswrapper[4792]: I0318 17:05:27.421743 4792 generic.go:334] "Generic (PLEG): container finished" podID="20edee68-6dc9-4204-b243-74f83dcff772" 
containerID="9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392" exitCode=0 Mar 18 17:05:27 crc kubenswrapper[4792]: I0318 17:05:27.421845 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerDied","Data":"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392"} Mar 18 17:05:28 crc kubenswrapper[4792]: I0318 17:05:28.434469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerStarted","Data":"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7"} Mar 18 17:05:28 crc kubenswrapper[4792]: I0318 17:05:28.457436 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbb7z" podStartSLOduration=2.736453434 podStartE2EDuration="10.457411895s" podCreationTimestamp="2026-03-18 17:05:18 +0000 UTC" firstStartedPulling="2026-03-18 17:05:20.346774807 +0000 UTC m=+5469.216103754" lastFinishedPulling="2026-03-18 17:05:28.067733278 +0000 UTC m=+5476.937062215" observedRunningTime="2026-03-18 17:05:28.451519459 +0000 UTC m=+5477.320848406" watchObservedRunningTime="2026-03-18 17:05:28.457411895 +0000 UTC m=+5477.326740822" Mar 18 17:05:28 crc kubenswrapper[4792]: I0318 17:05:28.625454 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:28 crc kubenswrapper[4792]: I0318 17:05:28.625761 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:05:29 crc kubenswrapper[4792]: I0318 17:05:29.676149 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbb7z" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" 
probeResult="failure" output=< Mar 18 17:05:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:05:29 crc kubenswrapper[4792]: > Mar 18 17:05:39 crc kubenswrapper[4792]: I0318 17:05:39.674815 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbb7z" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" probeResult="failure" output=< Mar 18 17:05:39 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:05:39 crc kubenswrapper[4792]: > Mar 18 17:05:49 crc kubenswrapper[4792]: I0318 17:05:49.975258 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbb7z" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" probeResult="failure" output=< Mar 18 17:05:49 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:05:49 crc kubenswrapper[4792]: > Mar 18 17:05:59 crc kubenswrapper[4792]: I0318 17:05:59.879569 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbb7z" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" probeResult="failure" output=< Mar 18 17:05:59 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:05:59 crc kubenswrapper[4792]: > Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.154107 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564226-6cqjd"] Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.157320 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.159232 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.163161 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.163639 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.169473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-6cqjd"] Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.219873 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfpz\" (UniqueName: \"kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz\") pod \"auto-csr-approver-29564226-6cqjd\" (UID: \"ee3de2dc-8a56-4b69-89ab-891290b2d254\") " pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.321405 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.321469 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:06:00 crc 
kubenswrapper[4792]: I0318 17:06:00.322690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfpz\" (UniqueName: \"kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz\") pod \"auto-csr-approver-29564226-6cqjd\" (UID: \"ee3de2dc-8a56-4b69-89ab-891290b2d254\") " pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.344776 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfpz\" (UniqueName: \"kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz\") pod \"auto-csr-approver-29564226-6cqjd\" (UID: \"ee3de2dc-8a56-4b69-89ab-891290b2d254\") " pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:00 crc kubenswrapper[4792]: I0318 17:06:00.488320 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:01 crc kubenswrapper[4792]: I0318 17:06:01.239400 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-6cqjd"] Mar 18 17:06:01 crc kubenswrapper[4792]: I0318 17:06:01.875193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" event={"ID":"ee3de2dc-8a56-4b69-89ab-891290b2d254","Type":"ContainerStarted","Data":"288fe3365b441f2d3d5463f1d0bdad32e6d382a944bc0b9677e694d618f34fb0"} Mar 18 17:06:03 crc kubenswrapper[4792]: I0318 17:06:03.900459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" event={"ID":"ee3de2dc-8a56-4b69-89ab-891290b2d254","Type":"ContainerStarted","Data":"868ad13e9f2c8f0be2b901432b09042159b31125d17674cdc1bc8ae2beb909a3"} Mar 18 17:06:03 crc kubenswrapper[4792]: I0318 17:06:03.915600 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29564226-6cqjd" podStartSLOduration=2.009563819 podStartE2EDuration="3.915582513s" podCreationTimestamp="2026-03-18 17:06:00 +0000 UTC" firstStartedPulling="2026-03-18 17:06:01.239745998 +0000 UTC m=+5510.109074935" lastFinishedPulling="2026-03-18 17:06:03.145764692 +0000 UTC m=+5512.015093629" observedRunningTime="2026-03-18 17:06:03.913022812 +0000 UTC m=+5512.782351749" watchObservedRunningTime="2026-03-18 17:06:03.915582513 +0000 UTC m=+5512.784911450" Mar 18 17:06:05 crc kubenswrapper[4792]: I0318 17:06:05.922230 4792 generic.go:334] "Generic (PLEG): container finished" podID="ee3de2dc-8a56-4b69-89ab-891290b2d254" containerID="868ad13e9f2c8f0be2b901432b09042159b31125d17674cdc1bc8ae2beb909a3" exitCode=0 Mar 18 17:06:05 crc kubenswrapper[4792]: I0318 17:06:05.922312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" event={"ID":"ee3de2dc-8a56-4b69-89ab-891290b2d254","Type":"ContainerDied","Data":"868ad13e9f2c8f0be2b901432b09042159b31125d17674cdc1bc8ae2beb909a3"} Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.584311 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.628076 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfpz\" (UniqueName: \"kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz\") pod \"ee3de2dc-8a56-4b69-89ab-891290b2d254\" (UID: \"ee3de2dc-8a56-4b69-89ab-891290b2d254\") " Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.637933 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz" (OuterVolumeSpecName: "kube-api-access-ljfpz") pod "ee3de2dc-8a56-4b69-89ab-891290b2d254" (UID: "ee3de2dc-8a56-4b69-89ab-891290b2d254"). InnerVolumeSpecName "kube-api-access-ljfpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.731867 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfpz\" (UniqueName: \"kubernetes.io/projected/ee3de2dc-8a56-4b69-89ab-891290b2d254-kube-api-access-ljfpz\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.950427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" event={"ID":"ee3de2dc-8a56-4b69-89ab-891290b2d254","Type":"ContainerDied","Data":"288fe3365b441f2d3d5463f1d0bdad32e6d382a944bc0b9677e694d618f34fb0"} Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.950469 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288fe3365b441f2d3d5463f1d0bdad32e6d382a944bc0b9677e694d618f34fb0" Mar 18 17:06:07 crc kubenswrapper[4792]: I0318 17:06:07.950472 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564226-6cqjd" Mar 18 17:06:08 crc kubenswrapper[4792]: I0318 17:06:08.016743 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-gwnn4"] Mar 18 17:06:08 crc kubenswrapper[4792]: I0318 17:06:08.029176 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564220-gwnn4"] Mar 18 17:06:08 crc kubenswrapper[4792]: I0318 17:06:08.680727 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:06:08 crc kubenswrapper[4792]: I0318 17:06:08.733774 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:06:08 crc kubenswrapper[4792]: I0318 17:06:08.920239 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:06:09 crc kubenswrapper[4792]: I0318 17:06:09.876675 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f23771-7d72-4164-936b-e1b35d09bda4" path="/var/lib/kubelet/pods/98f23771-7d72-4164-936b-e1b35d09bda4/volumes" Mar 18 17:06:09 crc kubenswrapper[4792]: I0318 17:06:09.971657 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbb7z" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" containerID="cri-o://f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7" gracePeriod=2 Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.500565 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.602336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities\") pod \"20edee68-6dc9-4204-b243-74f83dcff772\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.602677 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content\") pod \"20edee68-6dc9-4204-b243-74f83dcff772\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.602755 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbfv\" (UniqueName: \"kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv\") pod \"20edee68-6dc9-4204-b243-74f83dcff772\" (UID: \"20edee68-6dc9-4204-b243-74f83dcff772\") " Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.604524 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities" (OuterVolumeSpecName: "utilities") pod "20edee68-6dc9-4204-b243-74f83dcff772" (UID: "20edee68-6dc9-4204-b243-74f83dcff772"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.616282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv" (OuterVolumeSpecName: "kube-api-access-7lbfv") pod "20edee68-6dc9-4204-b243-74f83dcff772" (UID: "20edee68-6dc9-4204-b243-74f83dcff772"). InnerVolumeSpecName "kube-api-access-7lbfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.705710 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.706053 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lbfv\" (UniqueName: \"kubernetes.io/projected/20edee68-6dc9-4204-b243-74f83dcff772-kube-api-access-7lbfv\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.768064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20edee68-6dc9-4204-b243-74f83dcff772" (UID: "20edee68-6dc9-4204-b243-74f83dcff772"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.808436 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20edee68-6dc9-4204-b243-74f83dcff772-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.984196 4792 generic.go:334] "Generic (PLEG): container finished" podID="20edee68-6dc9-4204-b243-74f83dcff772" containerID="f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7" exitCode=0 Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.984250 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerDied","Data":"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7"} Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.984286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xbb7z" event={"ID":"20edee68-6dc9-4204-b243-74f83dcff772","Type":"ContainerDied","Data":"1031064652e50a5338f829ca40b06be3b385c3c026167e3340ca4e69823c5b67"} Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.984319 4792 scope.go:117] "RemoveContainer" containerID="f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7" Mar 18 17:06:10 crc kubenswrapper[4792]: I0318 17:06:10.984373 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbb7z" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.019294 4792 scope.go:117] "RemoveContainer" containerID="9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.036823 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.048514 4792 scope.go:117] "RemoveContainer" containerID="3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.049213 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbb7z"] Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.106723 4792 scope.go:117] "RemoveContainer" containerID="f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7" Mar 18 17:06:11 crc kubenswrapper[4792]: E0318 17:06:11.107512 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7\": container with ID starting with f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7 not found: ID does not exist" containerID="f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.107571 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7"} err="failed to get container status \"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7\": rpc error: code = NotFound desc = could not find container \"f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7\": container with ID starting with f9910d801ae00500a2421d579b0126cd696f61921acc1c8179b2d2502d8162d7 not found: ID does not exist" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.107611 4792 scope.go:117] "RemoveContainer" containerID="9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392" Mar 18 17:06:11 crc kubenswrapper[4792]: E0318 17:06:11.108010 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392\": container with ID starting with 9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392 not found: ID does not exist" containerID="9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.108034 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392"} err="failed to get container status \"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392\": rpc error: code = NotFound desc = could not find container \"9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392\": container with ID starting with 9644d61e4d211b7a866bc5498923aba05311fd892db253b09aba776993740392 not found: ID does not exist" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.108049 4792 scope.go:117] "RemoveContainer" containerID="3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370" Mar 18 17:06:11 crc kubenswrapper[4792]: E0318 
17:06:11.108651 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370\": container with ID starting with 3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370 not found: ID does not exist" containerID="3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.108695 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370"} err="failed to get container status \"3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370\": rpc error: code = NotFound desc = could not find container \"3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370\": container with ID starting with 3bdb5610b563674b454414ace33dbc8d4b5a653cab3926fc81e5618da6fa7370 not found: ID does not exist" Mar 18 17:06:11 crc kubenswrapper[4792]: I0318 17:06:11.874939 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20edee68-6dc9-4204-b243-74f83dcff772" path="/var/lib/kubelet/pods/20edee68-6dc9-4204-b243-74f83dcff772/volumes" Mar 18 17:06:30 crc kubenswrapper[4792]: I0318 17:06:30.322429 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:06:30 crc kubenswrapper[4792]: I0318 17:06:30.324246 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 17:06:59 crc kubenswrapper[4792]: I0318 17:06:59.400647 4792 scope.go:117] "RemoveContainer" containerID="b635e8276e1a5a1637bf307d03106e8e8a81579a01993f1d69b692c083448d3f" Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.321496 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.321870 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.321934 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.323390 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.323475 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a" gracePeriod=600 Mar 18 17:07:00 crc 
kubenswrapper[4792]: I0318 17:07:00.535086 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a" exitCode=0 Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.535150 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a"} Mar 18 17:07:00 crc kubenswrapper[4792]: I0318 17:07:00.535196 4792 scope.go:117] "RemoveContainer" containerID="5c8889c4df3988c253f1673be3bca8dbc3a72eca5f207620904425eb0c9bc3dd" Mar 18 17:07:01 crc kubenswrapper[4792]: I0318 17:07:01.549204 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8"} Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.277580 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:18 crc kubenswrapper[4792]: E0318 17:07:18.279652 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="extract-content" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.279756 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="extract-content" Mar 18 17:07:18 crc kubenswrapper[4792]: E0318 17:07:18.279826 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3de2dc-8a56-4b69-89ab-891290b2d254" containerName="oc" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.279883 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee3de2dc-8a56-4b69-89ab-891290b2d254" containerName="oc" Mar 18 17:07:18 crc kubenswrapper[4792]: E0318 17:07:18.279954 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.280082 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" Mar 18 17:07:18 crc kubenswrapper[4792]: E0318 17:07:18.280176 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="extract-utilities" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.280272 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="extract-utilities" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.280685 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="20edee68-6dc9-4204-b243-74f83dcff772" containerName="registry-server" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.280813 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3de2dc-8a56-4b69-89ab-891290b2d254" containerName="oc" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.282656 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.305093 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.399704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9drk\" (UniqueName: \"kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.400255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.400320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.502233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9drk\" (UniqueName: \"kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.502467 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.502493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.502947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.503164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:18 crc kubenswrapper[4792]: I0318 17:07:18.932262 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9drk\" (UniqueName: \"kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk\") pod \"community-operators-4rhnm\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:19 crc kubenswrapper[4792]: I0318 17:07:19.218675 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:19 crc kubenswrapper[4792]: I0318 17:07:19.718652 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:19 crc kubenswrapper[4792]: I0318 17:07:19.804292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerStarted","Data":"8b16764e3312c27954082649623381108f91935d700b97be2531d3e4e55720ab"} Mar 18 17:07:20 crc kubenswrapper[4792]: I0318 17:07:20.818708 4792 generic.go:334] "Generic (PLEG): container finished" podID="08f43e4a-df13-418c-b76d-60e541a70f25" containerID="b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532" exitCode=0 Mar 18 17:07:20 crc kubenswrapper[4792]: I0318 17:07:20.818785 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerDied","Data":"b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532"} Mar 18 17:07:21 crc kubenswrapper[4792]: I0318 17:07:21.834405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerStarted","Data":"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f"} Mar 18 17:07:23 crc kubenswrapper[4792]: I0318 17:07:23.858727 4792 generic.go:334] "Generic (PLEG): container finished" podID="08f43e4a-df13-418c-b76d-60e541a70f25" containerID="71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f" exitCode=0 Mar 18 17:07:23 crc kubenswrapper[4792]: I0318 17:07:23.868284 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" 
event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerDied","Data":"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f"} Mar 18 17:07:24 crc kubenswrapper[4792]: I0318 17:07:24.876654 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerStarted","Data":"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a"} Mar 18 17:07:24 crc kubenswrapper[4792]: I0318 17:07:24.906292 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rhnm" podStartSLOduration=3.403367639 podStartE2EDuration="6.906276393s" podCreationTimestamp="2026-03-18 17:07:18 +0000 UTC" firstStartedPulling="2026-03-18 17:07:20.823266329 +0000 UTC m=+5589.692595266" lastFinishedPulling="2026-03-18 17:07:24.326175083 +0000 UTC m=+5593.195504020" observedRunningTime="2026-03-18 17:07:24.905624912 +0000 UTC m=+5593.774953859" watchObservedRunningTime="2026-03-18 17:07:24.906276393 +0000 UTC m=+5593.775605330" Mar 18 17:07:29 crc kubenswrapper[4792]: I0318 17:07:29.219598 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:29 crc kubenswrapper[4792]: I0318 17:07:29.220151 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:29 crc kubenswrapper[4792]: I0318 17:07:29.278725 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:30 crc kubenswrapper[4792]: I0318 17:07:30.587217 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:30 crc kubenswrapper[4792]: I0318 17:07:30.655342 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:31 crc kubenswrapper[4792]: I0318 17:07:31.948854 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rhnm" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="registry-server" containerID="cri-o://3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a" gracePeriod=2 Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.481590 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.656862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9drk\" (UniqueName: \"kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk\") pod \"08f43e4a-df13-418c-b76d-60e541a70f25\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.656919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content\") pod \"08f43e4a-df13-418c-b76d-60e541a70f25\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.657026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities\") pod \"08f43e4a-df13-418c-b76d-60e541a70f25\" (UID: \"08f43e4a-df13-418c-b76d-60e541a70f25\") " Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.658159 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities" (OuterVolumeSpecName: "utilities") pod "08f43e4a-df13-418c-b76d-60e541a70f25" (UID: 
"08f43e4a-df13-418c-b76d-60e541a70f25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.667031 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk" (OuterVolumeSpecName: "kube-api-access-j9drk") pod "08f43e4a-df13-418c-b76d-60e541a70f25" (UID: "08f43e4a-df13-418c-b76d-60e541a70f25"). InnerVolumeSpecName "kube-api-access-j9drk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.708098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08f43e4a-df13-418c-b76d-60e541a70f25" (UID: "08f43e4a-df13-418c-b76d-60e541a70f25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.760105 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9drk\" (UniqueName: \"kubernetes.io/projected/08f43e4a-df13-418c-b76d-60e541a70f25-kube-api-access-j9drk\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.760446 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.760568 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f43e4a-df13-418c-b76d-60e541a70f25-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.960646 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="08f43e4a-df13-418c-b76d-60e541a70f25" containerID="3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a" exitCode=0 Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.960688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerDied","Data":"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a"} Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.960717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rhnm" event={"ID":"08f43e4a-df13-418c-b76d-60e541a70f25","Type":"ContainerDied","Data":"8b16764e3312c27954082649623381108f91935d700b97be2531d3e4e55720ab"} Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.960736 4792 scope.go:117] "RemoveContainer" containerID="3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.960754 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rhnm" Mar 18 17:07:32 crc kubenswrapper[4792]: I0318 17:07:32.991447 4792 scope.go:117] "RemoveContainer" containerID="71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.010268 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.024815 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rhnm"] Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.045963 4792 scope.go:117] "RemoveContainer" containerID="b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.111799 4792 scope.go:117] "RemoveContainer" containerID="3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a" Mar 18 17:07:33 crc kubenswrapper[4792]: E0318 17:07:33.112241 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a\": container with ID starting with 3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a not found: ID does not exist" containerID="3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.112274 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a"} err="failed to get container status \"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a\": rpc error: code = NotFound desc = could not find container \"3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a\": container with ID starting with 3627a249d82c817704164b3c5004dfe880d1d11f9d8a06e72390cde7f2eadf0a not 
found: ID does not exist" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.112296 4792 scope.go:117] "RemoveContainer" containerID="71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f" Mar 18 17:07:33 crc kubenswrapper[4792]: E0318 17:07:33.112771 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f\": container with ID starting with 71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f not found: ID does not exist" containerID="71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.112797 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f"} err="failed to get container status \"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f\": rpc error: code = NotFound desc = could not find container \"71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f\": container with ID starting with 71786a2e42fded58bd53cabe84beb288923fb51ea7af4ef32102cdf91c5d997f not found: ID does not exist" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.112813 4792 scope.go:117] "RemoveContainer" containerID="b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532" Mar 18 17:07:33 crc kubenswrapper[4792]: E0318 17:07:33.113152 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532\": container with ID starting with b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532 not found: ID does not exist" containerID="b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.113173 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532"} err="failed to get container status \"b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532\": rpc error: code = NotFound desc = could not find container \"b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532\": container with ID starting with b80a1eb1145b8da800048b3c1023aae83381a3d5e4e79bb22feae39089485532 not found: ID does not exist" Mar 18 17:07:33 crc kubenswrapper[4792]: I0318 17:07:33.867822 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" path="/var/lib/kubelet/pods/08f43e4a-df13-418c-b76d-60e541a70f25/volumes" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.147177 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564228-qqbff"] Mar 18 17:08:00 crc kubenswrapper[4792]: E0318 17:08:00.148259 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="extract-utilities" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.148272 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="extract-utilities" Mar 18 17:08:00 crc kubenswrapper[4792]: E0318 17:08:00.148296 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="extract-content" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.148302 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="extract-content" Mar 18 17:08:00 crc kubenswrapper[4792]: E0318 17:08:00.148320 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 
17:08:00.148327 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.148582 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f43e4a-df13-418c-b76d-60e541a70f25" containerName="registry-server" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.149630 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.152102 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.152216 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.153240 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.165058 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-qqbff"] Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.229788 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdgc\" (UniqueName: \"kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc\") pod \"auto-csr-approver-29564228-qqbff\" (UID: \"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0\") " pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.333076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdgc\" (UniqueName: \"kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc\") pod \"auto-csr-approver-29564228-qqbff\" (UID: 
\"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0\") " pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.353127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdgc\" (UniqueName: \"kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc\") pod \"auto-csr-approver-29564228-qqbff\" (UID: \"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0\") " pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.473721 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:00 crc kubenswrapper[4792]: I0318 17:08:00.943479 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-qqbff"] Mar 18 17:08:01 crc kubenswrapper[4792]: I0318 17:08:01.288878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-qqbff" event={"ID":"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0","Type":"ContainerStarted","Data":"6620da5714c9ff7200feaf930e723c86f53256c157e9d23d807417df43e9634c"} Mar 18 17:08:04 crc kubenswrapper[4792]: I0318 17:08:04.322188 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" containerID="0cd3debd5cd8c9fd84eeb59678554d92da41752100c8879013306eb772ada716" exitCode=0 Mar 18 17:08:04 crc kubenswrapper[4792]: I0318 17:08:04.322277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-qqbff" event={"ID":"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0","Type":"ContainerDied","Data":"0cd3debd5cd8c9fd84eeb59678554d92da41752100c8879013306eb772ada716"} Mar 18 17:08:05 crc kubenswrapper[4792]: I0318 17:08:05.910213 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:05 crc kubenswrapper[4792]: I0318 17:08:05.970648 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vdgc\" (UniqueName: \"kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc\") pod \"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0\" (UID: \"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0\") " Mar 18 17:08:05 crc kubenswrapper[4792]: I0318 17:08:05.976207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc" (OuterVolumeSpecName: "kube-api-access-7vdgc") pod "d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" (UID: "d3f43cc3-e797-4876-b8b4-2bb3a0d16df0"). InnerVolumeSpecName "kube-api-access-7vdgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:08:06 crc kubenswrapper[4792]: I0318 17:08:06.074632 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vdgc\" (UniqueName: \"kubernetes.io/projected/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0-kube-api-access-7vdgc\") on node \"crc\" DevicePath \"\"" Mar 18 17:08:06 crc kubenswrapper[4792]: I0318 17:08:06.349523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564228-qqbff" event={"ID":"d3f43cc3-e797-4876-b8b4-2bb3a0d16df0","Type":"ContainerDied","Data":"6620da5714c9ff7200feaf930e723c86f53256c157e9d23d807417df43e9634c"} Mar 18 17:08:06 crc kubenswrapper[4792]: I0318 17:08:06.349561 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6620da5714c9ff7200feaf930e723c86f53256c157e9d23d807417df43e9634c" Mar 18 17:08:06 crc kubenswrapper[4792]: I0318 17:08:06.349610 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564228-qqbff" Mar 18 17:08:06 crc kubenswrapper[4792]: I0318 17:08:06.984772 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-nshkt"] Mar 18 17:08:07 crc kubenswrapper[4792]: I0318 17:08:07.000326 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564222-nshkt"] Mar 18 17:08:07 crc kubenswrapper[4792]: I0318 17:08:07.871564 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8deb8186-c72b-4eef-8547-cbebc8cb0985" path="/var/lib/kubelet/pods/8deb8186-c72b-4eef-8547-cbebc8cb0985/volumes" Mar 18 17:08:59 crc kubenswrapper[4792]: I0318 17:08:59.587539 4792 scope.go:117] "RemoveContainer" containerID="879b3755e387efcba6ad0e2cf70bd9dceffed983ed73cd8d57ab2c32c69d8b92" Mar 18 17:09:00 crc kubenswrapper[4792]: I0318 17:09:00.321674 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:09:00 crc kubenswrapper[4792]: I0318 17:09:00.322076 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.225642 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"] Mar 18 17:09:15 crc kubenswrapper[4792]: E0318 17:09:15.227812 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" containerName="oc" Mar 18 17:09:15 crc 
kubenswrapper[4792]: I0318 17:09:15.227896 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" containerName="oc" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.228352 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" containerName="oc" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.230256 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.253789 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"] Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.281663 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.282082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.282409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9rb\" (UniqueName: \"kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc 
kubenswrapper[4792]: I0318 17:09:15.384844 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.385009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9rb\" (UniqueName: \"kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.385247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.385901 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.386204 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.404286 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9rb\" (UniqueName: \"kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb\") pod \"certified-operators-q8qhs\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") " pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:15 crc kubenswrapper[4792]: I0318 17:09:15.558943 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:16 crc kubenswrapper[4792]: I0318 17:09:16.078713 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"] Mar 18 17:09:16 crc kubenswrapper[4792]: I0318 17:09:16.102839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerStarted","Data":"67ee061ca8f6494c686ce730826d58765f4fd1b0091613c258984bcac60edfe2"} Mar 18 17:09:17 crc kubenswrapper[4792]: I0318 17:09:17.116872 4792 generic.go:334] "Generic (PLEG): container finished" podID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerID="3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20" exitCode=0 Mar 18 17:09:17 crc kubenswrapper[4792]: I0318 17:09:17.117011 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerDied","Data":"3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20"} Mar 18 17:09:19 crc kubenswrapper[4792]: I0318 17:09:19.141723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerStarted","Data":"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"} Mar 18 17:09:20 crc kubenswrapper[4792]: I0318 17:09:20.156467 4792 
generic.go:334] "Generic (PLEG): container finished" podID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerID="cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05" exitCode=0 Mar 18 17:09:20 crc kubenswrapper[4792]: I0318 17:09:20.156506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerDied","Data":"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"} Mar 18 17:09:21 crc kubenswrapper[4792]: I0318 17:09:21.170228 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerStarted","Data":"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"} Mar 18 17:09:21 crc kubenswrapper[4792]: I0318 17:09:21.199586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q8qhs" podStartSLOduration=2.49198598 podStartE2EDuration="6.199560157s" podCreationTimestamp="2026-03-18 17:09:15 +0000 UTC" firstStartedPulling="2026-03-18 17:09:17.119743014 +0000 UTC m=+5705.989071951" lastFinishedPulling="2026-03-18 17:09:20.827317181 +0000 UTC m=+5709.696646128" observedRunningTime="2026-03-18 17:09:21.186865716 +0000 UTC m=+5710.056194663" watchObservedRunningTime="2026-03-18 17:09:21.199560157 +0000 UTC m=+5710.068889094" Mar 18 17:09:25 crc kubenswrapper[4792]: I0318 17:09:25.559825 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:25 crc kubenswrapper[4792]: I0318 17:09:25.560456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:26 crc kubenswrapper[4792]: I0318 17:09:26.624578 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-q8qhs" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="registry-server" probeResult="failure" output=< Mar 18 17:09:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:09:26 crc kubenswrapper[4792]: > Mar 18 17:09:30 crc kubenswrapper[4792]: I0318 17:09:30.322456 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:09:30 crc kubenswrapper[4792]: I0318 17:09:30.323022 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:09:35 crc kubenswrapper[4792]: I0318 17:09:35.623663 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:35 crc kubenswrapper[4792]: I0318 17:09:35.675220 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q8qhs" Mar 18 17:09:35 crc kubenswrapper[4792]: I0318 17:09:35.866162 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"] Mar 18 17:09:37 crc kubenswrapper[4792]: I0318 17:09:37.352195 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q8qhs" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="registry-server" containerID="cri-o://5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb" gracePeriod=2 Mar 18 17:09:38 crc 
kubenswrapper[4792]: I0318 17:09:38.061358 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8qhs"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.158642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq9rb\" (UniqueName: \"kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb\") pod \"268ec89f-288e-470c-a074-9b32f1f2a03f\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") "
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.158713 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content\") pod \"268ec89f-288e-470c-a074-9b32f1f2a03f\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") "
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.158896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities\") pod \"268ec89f-288e-470c-a074-9b32f1f2a03f\" (UID: \"268ec89f-288e-470c-a074-9b32f1f2a03f\") "
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.159777 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities" (OuterVolumeSpecName: "utilities") pod "268ec89f-288e-470c-a074-9b32f1f2a03f" (UID: "268ec89f-288e-470c-a074-9b32f1f2a03f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.164800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb" (OuterVolumeSpecName: "kube-api-access-rq9rb") pod "268ec89f-288e-470c-a074-9b32f1f2a03f" (UID: "268ec89f-288e-470c-a074-9b32f1f2a03f"). InnerVolumeSpecName "kube-api-access-rq9rb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.212585 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "268ec89f-288e-470c-a074-9b32f1f2a03f" (UID: "268ec89f-288e-470c-a074-9b32f1f2a03f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.261544 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.261578 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq9rb\" (UniqueName: \"kubernetes.io/projected/268ec89f-288e-470c-a074-9b32f1f2a03f-kube-api-access-rq9rb\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.261590 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/268ec89f-288e-470c-a074-9b32f1f2a03f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.364256 4792 generic.go:334] "Generic (PLEG): container finished" podID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerID="5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb" exitCode=0
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.364311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerDied","Data":"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"}
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.364328 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8qhs"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.364346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8qhs" event={"ID":"268ec89f-288e-470c-a074-9b32f1f2a03f","Type":"ContainerDied","Data":"67ee061ca8f6494c686ce730826d58765f4fd1b0091613c258984bcac60edfe2"}
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.364368 4792 scope.go:117] "RemoveContainer" containerID="5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.393792 4792 scope.go:117] "RemoveContainer" containerID="cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.401342 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"]
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.412072 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q8qhs"]
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.427941 4792 scope.go:117] "RemoveContainer" containerID="3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.478930 4792 scope.go:117] "RemoveContainer" containerID="5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"
Mar 18 17:09:38 crc kubenswrapper[4792]: E0318 17:09:38.479403 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb\": container with ID starting with 5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb not found: ID does not exist" containerID="5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.479435 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb"} err="failed to get container status \"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb\": rpc error: code = NotFound desc = could not find container \"5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb\": container with ID starting with 5ae5172971676e8bb513f275b009ae075d6a5b01714d62876bdc01d11ef709eb not found: ID does not exist"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.479455 4792 scope.go:117] "RemoveContainer" containerID="cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"
Mar 18 17:09:38 crc kubenswrapper[4792]: E0318 17:09:38.479789 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05\": container with ID starting with cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05 not found: ID does not exist" containerID="cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.479811 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05"} err="failed to get container status \"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05\": rpc error: code = NotFound desc = could not find container \"cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05\": container with ID starting with cf9b4f0e8c64e6f59b5b2d8fe702186a3adb92c8c4a9a43a470f0eab2e98ed05 not found: ID does not exist"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.479825 4792 scope.go:117] "RemoveContainer" containerID="3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20"
Mar 18 17:09:38 crc kubenswrapper[4792]: E0318 17:09:38.480084 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20\": container with ID starting with 3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20 not found: ID does not exist" containerID="3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20"
Mar 18 17:09:38 crc kubenswrapper[4792]: I0318 17:09:38.480103 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20"} err="failed to get container status \"3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20\": rpc error: code = NotFound desc = could not find container \"3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20\": container with ID starting with 3da966d8c6796788bfb27ca210834bba93bb3087bc4238db8346a76c39fb4f20 not found: ID does not exist"
Mar 18 17:09:39 crc kubenswrapper[4792]: I0318 17:09:39.870902 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" path="/var/lib/kubelet/pods/268ec89f-288e-470c-a074-9b32f1f2a03f/volumes"
Mar 18 17:09:48 crc kubenswrapper[4792]: I0318 17:09:48.473693 4792 generic.go:334] "Generic (PLEG): container finished" podID="11070522-2520-4564-a02c-3bd460ae33fe" containerID="54e2c32417ddc412685148278d33c076ee0cb52cf58f22f52650de7e4e9a6cc8" exitCode=1
Mar 18 17:09:48 crc kubenswrapper[4792]: I0318 17:09:48.473785 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"11070522-2520-4564-a02c-3bd460ae33fe","Type":"ContainerDied","Data":"54e2c32417ddc412685148278d33c076ee0cb52cf58f22f52650de7e4e9a6cc8"}
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.911029 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942767 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq7z2\" (UniqueName: \"kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942902 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.942996 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.943030 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.943275 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.943333 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret\") pod \"11070522-2520-4564-a02c-3bd460ae33fe\" (UID: \"11070522-2520-4564-a02c-3bd460ae33fe\") "
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.944298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.944443 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data" (OuterVolumeSpecName: "config-data") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.955723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.957689 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.957724 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/11070522-2520-4564-a02c-3bd460ae33fe-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.957739 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.961709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2" (OuterVolumeSpecName: "kube-api-access-qq7z2") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "kube-api-access-qq7z2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:09:49 crc kubenswrapper[4792]: I0318 17:09:49.963396 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.002817 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.012910 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.013352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.042113 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "11070522-2520-4564-a02c-3bd460ae33fe" (UID: "11070522-2520-4564-a02c-3bd460ae33fe"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061167 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061219 4792 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061234 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061250 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11070522-2520-4564-a02c-3bd460ae33fe-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061259 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq7z2\" (UniqueName: \"kubernetes.io/projected/11070522-2520-4564-a02c-3bd460ae33fe-kube-api-access-qq7z2\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.061270 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11070522-2520-4564-a02c-3bd460ae33fe-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.089382 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.164687 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.504666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"11070522-2520-4564-a02c-3bd460ae33fe","Type":"ContainerDied","Data":"7c85f5866742680724d018262bbaf326c49125ccd67c4019f1f26979b44ea100"}
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.504719 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c85f5866742680724d018262bbaf326c49125ccd67c4019f1f26979b44ea100"
Mar 18 17:09:50 crc kubenswrapper[4792]: I0318 17:09:50.504816 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.251825 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 17:09:59 crc kubenswrapper[4792]: E0318 17:09:59.253583 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11070522-2520-4564-a02c-3bd460ae33fe" containerName="tempest-tests-tempest-tests-runner"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.253609 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="11070522-2520-4564-a02c-3bd460ae33fe" containerName="tempest-tests-tempest-tests-runner"
Mar 18 17:09:59 crc kubenswrapper[4792]: E0318 17:09:59.253643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="extract-utilities"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.253656 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="extract-utilities"
Mar 18 17:09:59 crc kubenswrapper[4792]: E0318 17:09:59.253689 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="extract-content"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.253701 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="extract-content"
Mar 18 17:09:59 crc kubenswrapper[4792]: E0318 17:09:59.253722 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="registry-server"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.253733 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="registry-server"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.254137 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="11070522-2520-4564-a02c-3bd460ae33fe" containerName="tempest-tests-tempest-tests-runner"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.254199 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="268ec89f-288e-470c-a074-9b32f1f2a03f" containerName="registry-server"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.255583 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.257789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ns7mj"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.277924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.436189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.436262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpvp\" (UniqueName: \"kubernetes.io/projected/831cf454-c068-4800-97d9-9d78d3a35fad-kube-api-access-7rpvp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.538699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.538791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpvp\" (UniqueName: \"kubernetes.io/projected/831cf454-c068-4800-97d9-9d78d3a35fad-kube-api-access-7rpvp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.540041 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.575466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpvp\" (UniqueName: \"kubernetes.io/projected/831cf454-c068-4800-97d9-9d78d3a35fad-kube-api-access-7rpvp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.593397 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"831cf454-c068-4800-97d9-9d78d3a35fad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:09:59 crc kubenswrapper[4792]: I0318 17:09:59.890631 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.157828 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564230-5qhqg"]
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.163908 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.167946 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.168247 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.169597 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.193697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-5qhqg"]
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.322757 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.323421 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.323561 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.326031 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.326167 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" gracePeriod=600
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.356653 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.358414 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.361155 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqx5\" (UniqueName: \"kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5\") pod \"auto-csr-approver-29564230-5qhqg\" (UID: \"acd50d1f-7cd9-408e-884b-878dbef6ee28\") " pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:00 crc kubenswrapper[4792]: E0318 17:10:00.447114 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.464598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqx5\" (UniqueName: \"kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5\") pod \"auto-csr-approver-29564230-5qhqg\" (UID: \"acd50d1f-7cd9-408e-884b-878dbef6ee28\") " pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.487139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqx5\" (UniqueName: \"kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5\") pod \"auto-csr-approver-29564230-5qhqg\" (UID: \"acd50d1f-7cd9-408e-884b-878dbef6ee28\") " pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.492910 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.823675 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" exitCode=0
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.823738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8"}
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.824181 4792 scope.go:117] "RemoveContainer" containerID="35126d0f2b2005c3e3c76c00acfb58aebc53a6fafa2e1c9419db524bb039de0a"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.824934 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8"
Mar 18 17:10:00 crc kubenswrapper[4792]: E0318 17:10:00.825432 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.825830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"831cf454-c068-4800-97d9-9d78d3a35fad","Type":"ContainerStarted","Data":"c753a1f3f69a629e96e19af6520eb46bdf4fc7ecc2a1e97fd55f9fda972f817d"}
Mar 18 17:10:00 crc kubenswrapper[4792]: I0318 17:10:00.954805 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-5qhqg"]
Mar 18 17:10:00 crc kubenswrapper[4792]: W0318 17:10:00.955617 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacd50d1f_7cd9_408e_884b_878dbef6ee28.slice/crio-dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2 WatchSource:0}: Error finding container dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2: Status 404 returned error can't find the container with id dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2
Mar 18 17:10:01 crc kubenswrapper[4792]: I0318 17:10:01.840394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"831cf454-c068-4800-97d9-9d78d3a35fad","Type":"ContainerStarted","Data":"002aa89f7ba562f11d8ff8ac20ecee4cff1e5ed8a8862a7d6272add1adad9537"}
Mar 18 17:10:01 crc kubenswrapper[4792]: I0318 17:10:01.842212 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" event={"ID":"acd50d1f-7cd9-408e-884b-878dbef6ee28","Type":"ContainerStarted","Data":"dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2"}
Mar 18 17:10:01 crc kubenswrapper[4792]: I0318 17:10:01.867682 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.660792415 podStartE2EDuration="2.867666028s" podCreationTimestamp="2026-03-18 17:09:59 +0000 UTC" firstStartedPulling="2026-03-18 17:10:00.358152197 +0000 UTC m=+5749.227481134" lastFinishedPulling="2026-03-18 17:10:01.56502581 +0000 UTC m=+5750.434354747" observedRunningTime="2026-03-18 17:10:01.857176066 +0000 UTC m=+5750.726505013" watchObservedRunningTime="2026-03-18 17:10:01.867666028 +0000 UTC m=+5750.736994965"
Mar 18 17:10:02 crc kubenswrapper[4792]: I0318 17:10:02.855631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" event={"ID":"acd50d1f-7cd9-408e-884b-878dbef6ee28","Type":"ContainerStarted","Data":"cd12f958b06bc3c5fa631bd6214a7cd949d72a3561a55d25f668f1855d25fb21"}
Mar 18 17:10:02 crc kubenswrapper[4792]: I0318 17:10:02.880681 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" podStartSLOduration=1.470049902 podStartE2EDuration="2.880655939s" podCreationTimestamp="2026-03-18 17:10:00 +0000 UTC" firstStartedPulling="2026-03-18 17:10:00.958129175 +0000 UTC m=+5749.827458112" lastFinishedPulling="2026-03-18 17:10:02.368735192 +0000 UTC m=+5751.238064149" observedRunningTime="2026-03-18 17:10:02.87246604 +0000 UTC m=+5751.741794977" watchObservedRunningTime="2026-03-18 17:10:02.880655939 +0000 UTC m=+5751.749984876"
Mar 18 17:10:03 crc kubenswrapper[4792]: I0318 17:10:03.870280 4792 generic.go:334] "Generic (PLEG): container finished" podID="acd50d1f-7cd9-408e-884b-878dbef6ee28" containerID="cd12f958b06bc3c5fa631bd6214a7cd949d72a3561a55d25f668f1855d25fb21" exitCode=0
Mar 18 17:10:03 crc kubenswrapper[4792]: I0318 17:10:03.870386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" event={"ID":"acd50d1f-7cd9-408e-884b-878dbef6ee28","Type":"ContainerDied","Data":"cd12f958b06bc3c5fa631bd6214a7cd949d72a3561a55d25f668f1855d25fb21"}
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.277123 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-5qhqg"
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.382085 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqx5\" (UniqueName: \"kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5\") pod \"acd50d1f-7cd9-408e-884b-878dbef6ee28\" (UID: \"acd50d1f-7cd9-408e-884b-878dbef6ee28\") "
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.393005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5" (OuterVolumeSpecName: "kube-api-access-ppqx5") pod "acd50d1f-7cd9-408e-884b-878dbef6ee28" (UID: "acd50d1f-7cd9-408e-884b-878dbef6ee28"). InnerVolumeSpecName "kube-api-access-ppqx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.485961 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppqx5\" (UniqueName: \"kubernetes.io/projected/acd50d1f-7cd9-408e-884b-878dbef6ee28-kube-api-access-ppqx5\") on node \"crc\" DevicePath \"\""
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.899653 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" event={"ID":"acd50d1f-7cd9-408e-884b-878dbef6ee28","Type":"ContainerDied","Data":"dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2"}
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.900016 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb5760f68fd00731788553af2e6a1dacf40009c927d36ab23c902aa41aef4a2"
Mar 18 17:10:05 crc kubenswrapper[4792]: I0318 17:10:05.899696 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564230-5qhqg" Mar 18 17:10:06 crc kubenswrapper[4792]: I0318 17:10:06.365614 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-jm7nf"] Mar 18 17:10:06 crc kubenswrapper[4792]: I0318 17:10:06.377091 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564224-jm7nf"] Mar 18 17:10:07 crc kubenswrapper[4792]: I0318 17:10:07.873475 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da3fb70-6060-4637-b00a-3591615d46e9" path="/var/lib/kubelet/pods/2da3fb70-6060-4637-b00a-3591615d46e9/volumes" Mar 18 17:10:11 crc kubenswrapper[4792]: I0318 17:10:11.868363 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:10:11 crc kubenswrapper[4792]: E0318 17:10:11.869659 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:10:22 crc kubenswrapper[4792]: I0318 17:10:22.854768 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:10:22 crc kubenswrapper[4792]: E0318 17:10:22.855481 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:10:33 crc kubenswrapper[4792]: I0318 17:10:33.855196 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:10:33 crc kubenswrapper[4792]: E0318 17:10:33.856113 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:10:48 crc kubenswrapper[4792]: I0318 17:10:48.854398 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:10:48 crc kubenswrapper[4792]: E0318 17:10:48.855353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.668063 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgh96/must-gather-c52ck"] Mar 18 17:10:55 crc kubenswrapper[4792]: E0318 17:10:55.669200 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd50d1f-7cd9-408e-884b-878dbef6ee28" containerName="oc" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.669217 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd50d1f-7cd9-408e-884b-878dbef6ee28" containerName="oc" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.669560 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="acd50d1f-7cd9-408e-884b-878dbef6ee28" containerName="oc" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.671289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.673899 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jgh96"/"openshift-service-ca.crt" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.673917 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jgh96"/"default-dockercfg-9rlc4" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.675171 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jgh96"/"kube-root-ca.crt" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.683396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jgh96/must-gather-c52ck"] Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.695453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q8lk\" (UniqueName: \"kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.695661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.798397 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8q8lk\" (UniqueName: \"kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.798651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.799202 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.815443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q8lk\" (UniqueName: \"kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk\") pod \"must-gather-c52ck\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:55 crc kubenswrapper[4792]: I0318 17:10:55.993512 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:10:57 crc kubenswrapper[4792]: I0318 17:10:57.262203 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jgh96/must-gather-c52ck"] Mar 18 17:10:57 crc kubenswrapper[4792]: I0318 17:10:57.483765 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/must-gather-c52ck" event={"ID":"c30591cd-5659-4403-b080-ace5f1d6d48f","Type":"ContainerStarted","Data":"11b650037fc6b4cbe86559d966d683e1471ff6ce2f09eda15b0947936b0436cd"} Mar 18 17:10:59 crc kubenswrapper[4792]: I0318 17:10:59.703606 4792 scope.go:117] "RemoveContainer" containerID="749b2a8c6fa3bc6e71759415966dc1cc8a1e52930a5bdf2074dc19a90672dff3" Mar 18 17:10:59 crc kubenswrapper[4792]: I0318 17:10:59.854178 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:10:59 crc kubenswrapper[4792]: E0318 17:10:59.854676 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:11:04 crc kubenswrapper[4792]: I0318 17:11:04.571948 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/must-gather-c52ck" event={"ID":"c30591cd-5659-4403-b080-ace5f1d6d48f","Type":"ContainerStarted","Data":"6b5b187f7dbd99c72c88f7946b52817a171be0bafa419a17efa41f0cfabbbcac"} Mar 18 17:11:04 crc kubenswrapper[4792]: I0318 17:11:04.572660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/must-gather-c52ck" 
event={"ID":"c30591cd-5659-4403-b080-ace5f1d6d48f","Type":"ContainerStarted","Data":"747e41d0837bba9406f2e4d6962d91726b330295636a21c4a6aad3dfc15f2ec6"} Mar 18 17:11:04 crc kubenswrapper[4792]: I0318 17:11:04.593269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jgh96/must-gather-c52ck" podStartSLOduration=2.998776328 podStartE2EDuration="9.593248165s" podCreationTimestamp="2026-03-18 17:10:55 +0000 UTC" firstStartedPulling="2026-03-18 17:10:57.266665688 +0000 UTC m=+5806.135994635" lastFinishedPulling="2026-03-18 17:11:03.861137535 +0000 UTC m=+5812.730466472" observedRunningTime="2026-03-18 17:11:04.589311762 +0000 UTC m=+5813.458640719" watchObservedRunningTime="2026-03-18 17:11:04.593248165 +0000 UTC m=+5813.462577102" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.368024 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgh96/crc-debug-vvz6p"] Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.370782 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.487266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.487408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6lj\" (UniqueName: \"kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.590707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.590834 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6lj\" (UniqueName: \"kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.590858 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc 
kubenswrapper[4792]: I0318 17:11:10.631125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6lj\" (UniqueName: \"kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj\") pod \"crc-debug-vvz6p\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: I0318 17:11:10.695092 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:11:10 crc kubenswrapper[4792]: W0318 17:11:10.746895 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58681a02_c60b_46a0_945f_b54a8ef7fa7e.slice/crio-2e44d314a57c1880cc072d224838f1e7bf5cc0a56c0f285363a74ad5c372f080 WatchSource:0}: Error finding container 2e44d314a57c1880cc072d224838f1e7bf5cc0a56c0f285363a74ad5c372f080: Status 404 returned error can't find the container with id 2e44d314a57c1880cc072d224838f1e7bf5cc0a56c0f285363a74ad5c372f080 Mar 18 17:11:11 crc kubenswrapper[4792]: I0318 17:11:11.650880 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" event={"ID":"58681a02-c60b-46a0-945f-b54a8ef7fa7e","Type":"ContainerStarted","Data":"2e44d314a57c1880cc072d224838f1e7bf5cc0a56c0f285363a74ad5c372f080"} Mar 18 17:11:12 crc kubenswrapper[4792]: I0318 17:11:12.854526 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:11:12 crc kubenswrapper[4792]: E0318 17:11:12.855229 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:11:23 crc kubenswrapper[4792]: I0318 17:11:23.801226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" event={"ID":"58681a02-c60b-46a0-945f-b54a8ef7fa7e","Type":"ContainerStarted","Data":"aa150696d06100318360a05e099b07d133b4e3ffac9b066bda3f651d10330dbb"} Mar 18 17:11:23 crc kubenswrapper[4792]: I0318 17:11:23.819413 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" podStartSLOduration=1.412646505 podStartE2EDuration="13.819393288s" podCreationTimestamp="2026-03-18 17:11:10 +0000 UTC" firstStartedPulling="2026-03-18 17:11:10.749554135 +0000 UTC m=+5819.618883072" lastFinishedPulling="2026-03-18 17:11:23.156300918 +0000 UTC m=+5832.025629855" observedRunningTime="2026-03-18 17:11:23.815786474 +0000 UTC m=+5832.685115411" watchObservedRunningTime="2026-03-18 17:11:23.819393288 +0000 UTC m=+5832.688722225" Mar 18 17:11:25 crc kubenswrapper[4792]: I0318 17:11:25.855096 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:11:25 crc kubenswrapper[4792]: E0318 17:11:25.857048 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:11:37 crc kubenswrapper[4792]: I0318 17:11:37.855752 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:11:37 crc kubenswrapper[4792]: E0318 17:11:37.856609 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:11:48 crc kubenswrapper[4792]: I0318 17:11:48.855152 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:11:48 crc kubenswrapper[4792]: E0318 17:11:48.856238 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:11:59 crc kubenswrapper[4792]: I0318 17:11:59.855448 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:11:59 crc kubenswrapper[4792]: E0318 17:11:59.856323 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.150481 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564232-4nhvl"] Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 
17:12:00.152455 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.155307 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.155588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.155766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.164111 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-4nhvl"] Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.239474 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc\") pod \"auto-csr-approver-29564232-4nhvl\" (UID: \"913dd3c9-fc8b-499e-846e-4c9380c3df75\") " pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:00 crc kubenswrapper[4792]: I0318 17:12:00.342864 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc\") pod \"auto-csr-approver-29564232-4nhvl\" (UID: \"913dd3c9-fc8b-499e-846e-4c9380c3df75\") " pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:01 crc kubenswrapper[4792]: I0318 17:12:01.039886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc\") pod 
\"auto-csr-approver-29564232-4nhvl\" (UID: \"913dd3c9-fc8b-499e-846e-4c9380c3df75\") " pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:01 crc kubenswrapper[4792]: I0318 17:12:01.079010 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:01 crc kubenswrapper[4792]: I0318 17:12:01.593331 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-4nhvl"] Mar 18 17:12:02 crc kubenswrapper[4792]: I0318 17:12:02.231277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" event={"ID":"913dd3c9-fc8b-499e-846e-4c9380c3df75","Type":"ContainerStarted","Data":"c75b7cdab4adb2910e0c831c27c9a68dc248ee19183cdac36579562dac382752"} Mar 18 17:12:04 crc kubenswrapper[4792]: I0318 17:12:04.268687 4792 generic.go:334] "Generic (PLEG): container finished" podID="913dd3c9-fc8b-499e-846e-4c9380c3df75" containerID="9dc6affc505286abc993646078ab80eec3c2e84f6160e493670e61f59e9b71a6" exitCode=0 Mar 18 17:12:04 crc kubenswrapper[4792]: I0318 17:12:04.268808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" event={"ID":"913dd3c9-fc8b-499e-846e-4c9380c3df75","Type":"ContainerDied","Data":"9dc6affc505286abc993646078ab80eec3c2e84f6160e493670e61f59e9b71a6"} Mar 18 17:12:05 crc kubenswrapper[4792]: I0318 17:12:05.710759 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:05 crc kubenswrapper[4792]: I0318 17:12:05.816465 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc\") pod \"913dd3c9-fc8b-499e-846e-4c9380c3df75\" (UID: \"913dd3c9-fc8b-499e-846e-4c9380c3df75\") " Mar 18 17:12:05 crc kubenswrapper[4792]: I0318 17:12:05.823356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc" (OuterVolumeSpecName: "kube-api-access-p2hxc") pod "913dd3c9-fc8b-499e-846e-4c9380c3df75" (UID: "913dd3c9-fc8b-499e-846e-4c9380c3df75"). InnerVolumeSpecName "kube-api-access-p2hxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:05 crc kubenswrapper[4792]: I0318 17:12:05.919578 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2hxc\" (UniqueName: \"kubernetes.io/projected/913dd3c9-fc8b-499e-846e-4c9380c3df75-kube-api-access-p2hxc\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:06 crc kubenswrapper[4792]: I0318 17:12:06.309264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" event={"ID":"913dd3c9-fc8b-499e-846e-4c9380c3df75","Type":"ContainerDied","Data":"c75b7cdab4adb2910e0c831c27c9a68dc248ee19183cdac36579562dac382752"} Mar 18 17:12:06 crc kubenswrapper[4792]: I0318 17:12:06.309561 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c75b7cdab4adb2910e0c831c27c9a68dc248ee19183cdac36579562dac382752" Mar 18 17:12:06 crc kubenswrapper[4792]: I0318 17:12:06.309367 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564232-4nhvl" Mar 18 17:12:06 crc kubenswrapper[4792]: I0318 17:12:06.798222 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-6cqjd"] Mar 18 17:12:06 crc kubenswrapper[4792]: I0318 17:12:06.815837 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564226-6cqjd"] Mar 18 17:12:07 crc kubenswrapper[4792]: I0318 17:12:07.867779 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3de2dc-8a56-4b69-89ab-891290b2d254" path="/var/lib/kubelet/pods/ee3de2dc-8a56-4b69-89ab-891290b2d254/volumes" Mar 18 17:12:11 crc kubenswrapper[4792]: I0318 17:12:11.865910 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:12:11 crc kubenswrapper[4792]: E0318 17:12:11.866566 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:12:15 crc kubenswrapper[4792]: I0318 17:12:15.404765 4792 generic.go:334] "Generic (PLEG): container finished" podID="58681a02-c60b-46a0-945f-b54a8ef7fa7e" containerID="aa150696d06100318360a05e099b07d133b4e3ffac9b066bda3f651d10330dbb" exitCode=0 Mar 18 17:12:15 crc kubenswrapper[4792]: I0318 17:12:15.404838 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" event={"ID":"58681a02-c60b-46a0-945f-b54a8ef7fa7e","Type":"ContainerDied","Data":"aa150696d06100318360a05e099b07d133b4e3ffac9b066bda3f651d10330dbb"} Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.568170 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.613310 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-vvz6p"] Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.624580 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-vvz6p"] Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.699014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc6lj\" (UniqueName: \"kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj\") pod \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.699139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host\") pod \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\" (UID: \"58681a02-c60b-46a0-945f-b54a8ef7fa7e\") " Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.699553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host" (OuterVolumeSpecName: "host") pod "58681a02-c60b-46a0-945f-b54a8ef7fa7e" (UID: "58681a02-c60b-46a0-945f-b54a8ef7fa7e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.700106 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58681a02-c60b-46a0-945f-b54a8ef7fa7e-host\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.705582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj" (OuterVolumeSpecName: "kube-api-access-kc6lj") pod "58681a02-c60b-46a0-945f-b54a8ef7fa7e" (UID: "58681a02-c60b-46a0-945f-b54a8ef7fa7e"). InnerVolumeSpecName "kube-api-access-kc6lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:16 crc kubenswrapper[4792]: I0318 17:12:16.803451 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc6lj\" (UniqueName: \"kubernetes.io/projected/58681a02-c60b-46a0-945f-b54a8ef7fa7e-kube-api-access-kc6lj\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.434170 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e44d314a57c1880cc072d224838f1e7bf5cc0a56c0f285363a74ad5c372f080" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.434532 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-vvz6p" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.796325 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgh96/crc-debug-79rzv"] Mar 18 17:12:17 crc kubenswrapper[4792]: E0318 17:12:17.797710 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58681a02-c60b-46a0-945f-b54a8ef7fa7e" containerName="container-00" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.797734 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="58681a02-c60b-46a0-945f-b54a8ef7fa7e" containerName="container-00" Mar 18 17:12:17 crc kubenswrapper[4792]: E0318 17:12:17.797751 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913dd3c9-fc8b-499e-846e-4c9380c3df75" containerName="oc" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.797759 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="913dd3c9-fc8b-499e-846e-4c9380c3df75" containerName="oc" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.798033 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="913dd3c9-fc8b-499e-846e-4c9380c3df75" containerName="oc" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.798054 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="58681a02-c60b-46a0-945f-b54a8ef7fa7e" containerName="container-00" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.798858 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.868354 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58681a02-c60b-46a0-945f-b54a8ef7fa7e" path="/var/lib/kubelet/pods/58681a02-c60b-46a0-945f-b54a8ef7fa7e/volumes" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.933655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p725d\" (UniqueName: \"kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:17 crc kubenswrapper[4792]: I0318 17:12:17.933996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.035930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p725d\" (UniqueName: \"kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.037017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.037162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.057480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p725d\" (UniqueName: \"kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d\") pod \"crc-debug-79rzv\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.118632 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:18 crc kubenswrapper[4792]: I0318 17:12:18.455881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-79rzv" event={"ID":"cd010dce-f5a2-40f3-82db-cd6614375c75","Type":"ContainerStarted","Data":"267ae0ecdfc685e42619351116b77001487ac53a285e0e72c994199e024f5ee9"} Mar 18 17:12:19 crc kubenswrapper[4792]: I0318 17:12:19.466690 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd010dce-f5a2-40f3-82db-cd6614375c75" containerID="bcabf6e9ea4d56a82cc0a34fd9b5fdbd2ccc8e832a44559a0915925d47aabdda" exitCode=0 Mar 18 17:12:19 crc kubenswrapper[4792]: I0318 17:12:19.467034 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-79rzv" event={"ID":"cd010dce-f5a2-40f3-82db-cd6614375c75","Type":"ContainerDied","Data":"bcabf6e9ea4d56a82cc0a34fd9b5fdbd2ccc8e832a44559a0915925d47aabdda"} Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.625514 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.694927 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p725d\" (UniqueName: \"kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d\") pod \"cd010dce-f5a2-40f3-82db-cd6614375c75\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.695307 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host\") pod \"cd010dce-f5a2-40f3-82db-cd6614375c75\" (UID: \"cd010dce-f5a2-40f3-82db-cd6614375c75\") " Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.695471 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host" (OuterVolumeSpecName: "host") pod "cd010dce-f5a2-40f3-82db-cd6614375c75" (UID: "cd010dce-f5a2-40f3-82db-cd6614375c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.695926 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd010dce-f5a2-40f3-82db-cd6614375c75-host\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.700620 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d" (OuterVolumeSpecName: "kube-api-access-p725d") pod "cd010dce-f5a2-40f3-82db-cd6614375c75" (UID: "cd010dce-f5a2-40f3-82db-cd6614375c75"). InnerVolumeSpecName "kube-api-access-p725d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:20 crc kubenswrapper[4792]: I0318 17:12:20.798131 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p725d\" (UniqueName: \"kubernetes.io/projected/cd010dce-f5a2-40f3-82db-cd6614375c75-kube-api-access-p725d\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:21 crc kubenswrapper[4792]: I0318 17:12:21.497470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-79rzv" event={"ID":"cd010dce-f5a2-40f3-82db-cd6614375c75","Type":"ContainerDied","Data":"267ae0ecdfc685e42619351116b77001487ac53a285e0e72c994199e024f5ee9"} Mar 18 17:12:21 crc kubenswrapper[4792]: I0318 17:12:21.497813 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="267ae0ecdfc685e42619351116b77001487ac53a285e0e72c994199e024f5ee9" Mar 18 17:12:21 crc kubenswrapper[4792]: I0318 17:12:21.497510 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-79rzv" Mar 18 17:12:21 crc kubenswrapper[4792]: I0318 17:12:21.968839 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-79rzv"] Mar 18 17:12:21 crc kubenswrapper[4792]: I0318 17:12:21.983838 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-79rzv"] Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.159682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jgh96/crc-debug-kwsq5"] Mar 18 17:12:23 crc kubenswrapper[4792]: E0318 17:12:23.161069 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd010dce-f5a2-40f3-82db-cd6614375c75" containerName="container-00" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.161102 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd010dce-f5a2-40f3-82db-cd6614375c75" containerName="container-00" Mar 18 17:12:23 crc 
kubenswrapper[4792]: I0318 17:12:23.161361 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd010dce-f5a2-40f3-82db-cd6614375c75" containerName="container-00" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.162251 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.257730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cg9c\" (UniqueName: \"kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.258314 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.360904 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.361034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cg9c\" (UniqueName: \"kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.361722 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.832349 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cg9c\" (UniqueName: \"kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c\") pod \"crc-debug-kwsq5\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:23 crc kubenswrapper[4792]: I0318 17:12:23.870824 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd010dce-f5a2-40f3-82db-cd6614375c75" path="/var/lib/kubelet/pods/cd010dce-f5a2-40f3-82db-cd6614375c75/volumes" Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.083401 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.530239 4792 generic.go:334] "Generic (PLEG): container finished" podID="c34e6919-2028-420e-b978-7ef420d10bb0" containerID="abf83e822a3221e7495a949383110f87f440f83a8692295d7d11dbca76f8aef9" exitCode=0 Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.530323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" event={"ID":"c34e6919-2028-420e-b978-7ef420d10bb0","Type":"ContainerDied","Data":"abf83e822a3221e7495a949383110f87f440f83a8692295d7d11dbca76f8aef9"} Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.530569 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" event={"ID":"c34e6919-2028-420e-b978-7ef420d10bb0","Type":"ContainerStarted","Data":"4acb19a19edd4a43caba116264d04401f8b775deaa172c03979c02fab2b4e461"} Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.569877 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-kwsq5"] Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.580678 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgh96/crc-debug-kwsq5"] Mar 18 17:12:24 crc kubenswrapper[4792]: I0318 17:12:24.855120 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:12:24 crc kubenswrapper[4792]: E0318 17:12:24.855510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:12:26 crc 
kubenswrapper[4792]: I0318 17:12:26.057187 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.131023 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host\") pod \"c34e6919-2028-420e-b978-7ef420d10bb0\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.131236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cg9c\" (UniqueName: \"kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c\") pod \"c34e6919-2028-420e-b978-7ef420d10bb0\" (UID: \"c34e6919-2028-420e-b978-7ef420d10bb0\") " Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.131225 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host" (OuterVolumeSpecName: "host") pod "c34e6919-2028-420e-b978-7ef420d10bb0" (UID: "c34e6919-2028-420e-b978-7ef420d10bb0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.132136 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c34e6919-2028-420e-b978-7ef420d10bb0-host\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.137868 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c" (OuterVolumeSpecName: "kube-api-access-2cg9c") pod "c34e6919-2028-420e-b978-7ef420d10bb0" (UID: "c34e6919-2028-420e-b978-7ef420d10bb0"). InnerVolumeSpecName "kube-api-access-2cg9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.234721 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cg9c\" (UniqueName: \"kubernetes.io/projected/c34e6919-2028-420e-b978-7ef420d10bb0-kube-api-access-2cg9c\") on node \"crc\" DevicePath \"\"" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.554631 4792 scope.go:117] "RemoveContainer" containerID="abf83e822a3221e7495a949383110f87f440f83a8692295d7d11dbca76f8aef9" Mar 18 17:12:26 crc kubenswrapper[4792]: I0318 17:12:26.554681 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/crc-debug-kwsq5" Mar 18 17:12:27 crc kubenswrapper[4792]: I0318 17:12:27.869237 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34e6919-2028-420e-b978-7ef420d10bb0" path="/var/lib/kubelet/pods/c34e6919-2028-420e-b978-7ef420d10bb0/volumes" Mar 18 17:12:37 crc kubenswrapper[4792]: I0318 17:12:37.855353 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:12:37 crc kubenswrapper[4792]: E0318 17:12:37.856014 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:12:49 crc kubenswrapper[4792]: I0318 17:12:49.860751 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:12:49 crc kubenswrapper[4792]: E0318 17:12:49.861814 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.399455 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e51032f9-f6e1-4f72-9185-784c3acae24b/aodh-api/0.log" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.615666 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e51032f9-f6e1-4f72-9185-784c3acae24b/aodh-evaluator/0.log" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.677111 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e51032f9-f6e1-4f72-9185-784c3acae24b/aodh-listener/0.log" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.760999 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e51032f9-f6e1-4f72-9185-784c3acae24b/aodh-notifier/0.log" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.814951 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85d9bfc98-xffcv_7d8a710c-9e71-411c-b036-b4f01dc4d420/barbican-api/0.log" Mar 18 17:12:58 crc kubenswrapper[4792]: I0318 17:12:58.920437 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85d9bfc98-xffcv_7d8a710c-9e71-411c-b036-b4f01dc4d420/barbican-api-log/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.083746 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7bf8bcc748-g9kq5_442ae180-60ac-4d2c-92eb-b9a823ba74a9/barbican-keystone-listener/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.201802 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7bf8bcc748-g9kq5_442ae180-60ac-4d2c-92eb-b9a823ba74a9/barbican-keystone-listener-log/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.312894 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d8b96c4f5-87qzw_fbcb88bd-9c5a-4e8f-bbe2-0109d7751292/barbican-worker/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.358356 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d8b96c4f5-87qzw_fbcb88bd-9c5a-4e8f-bbe2-0109d7751292/barbican-worker-log/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.483648 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zggfc_e8427835-8b71-4705-91e2-d82092ec93f5/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.661808 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec93232a-54d0-42a4-a659-ed6fc86913c6/ceilometer-central-agent/1.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.758384 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec93232a-54d0-42a4-a659-ed6fc86913c6/ceilometer-notification-agent/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.805594 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec93232a-54d0-42a4-a659-ed6fc86913c6/ceilometer-central-agent/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.966630 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec93232a-54d0-42a4-a659-ed6fc86913c6/sg-core/0.log" Mar 18 17:12:59 crc kubenswrapper[4792]: I0318 17:12:59.986165 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ec93232a-54d0-42a4-a659-ed6fc86913c6/proxy-httpd/0.log" Mar 18 17:13:00 crc 
kubenswrapper[4792]: I0318 17:13:00.142262 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5f827383-b345-4dd5-958f-54a72cb634b7/cinder-api/0.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.295576 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5f827383-b345-4dd5-958f-54a72cb634b7/cinder-api-log/0.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.384842 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d41b9217-24bd-4b7c-98f7-04ec8ca9bf89/cinder-scheduler/1.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.424090 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d41b9217-24bd-4b7c-98f7-04ec8ca9bf89/cinder-scheduler/0.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.534184 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d41b9217-24bd-4b7c-98f7-04ec8ca9bf89/probe/0.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.906136 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-t584h_e5814716-d18e-49c1-8543-b99e741df9d9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:00 crc kubenswrapper[4792]: I0318 17:13:00.995062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dqvmg_7cb772fb-4950-4c27-b7c5-28ca75682b99/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.338609 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-xp9vl_4e9dff47-8cda-4561-8f7e-d381ad180ea6/init/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.537277 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-xp9vl_4e9dff47-8cda-4561-8f7e-d381ad180ea6/init/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.614080 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-xp9vl_4e9dff47-8cda-4561-8f7e-d381ad180ea6/dnsmasq-dns/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.639518 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qmjm_39d160c7-decc-4473-9c4f-f1282a927485/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.872660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6b102e52-1964-4051-b1f3-e066c77b7919/glance-httpd/0.log" Mar 18 17:13:01 crc kubenswrapper[4792]: I0318 17:13:01.906442 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6b102e52-1964-4051-b1f3-e066c77b7919/glance-log/0.log" Mar 18 17:13:02 crc kubenswrapper[4792]: I0318 17:13:02.105897 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_419ab4d0-1257-4c7d-89de-2d2ebcee1a74/glance-httpd/0.log" Mar 18 17:13:02 crc kubenswrapper[4792]: I0318 17:13:02.133793 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_419ab4d0-1257-4c7d-89de-2d2ebcee1a74/glance-log/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.091049 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-58f688bc9b-9h9n4_a03aec86-430b-4209-8f15-f7fb97d58276/heat-api/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.111429 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7d5dfb5b8b-2znh8_46138c03-275f-46ea-b4d5-2947fcfe979c/heat-engine/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 
17:13:03.157381 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-77zpj_4620d8fc-6dea-47d8-9d4a-fd9e8ceb2116/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.207431 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-f8cb7866d-pk45f_c8032f2a-13e3-4463-bab4-1b1d850e4b06/heat-cfnapi/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.403955 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4dqtw_ea475c80-81c5-4bb6-937f-4a2a87d6d9e7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.576186 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564161-5whqc_b8f22059-bc37-4a08-911c-f38b0b38b322/keystone-cron/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.648464 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564221-pvzkd_cd3213a9-53e0-4373-b606-2e7166eb8e26/keystone-cron/0.log" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.804804 4792 scope.go:117] "RemoveContainer" containerID="868ad13e9f2c8f0be2b901432b09042159b31125d17674cdc1bc8ae2beb909a3" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.855358 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:13:03 crc kubenswrapper[4792]: E0318 17:13:03.858068 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:13:03 crc kubenswrapper[4792]: I0318 17:13:03.936845 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2b458d30-1f6c-4042-989d-71e39a0aece2/kube-state-metrics/0.log" Mar 18 17:13:04 crc kubenswrapper[4792]: I0318 17:13:04.209504 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wt24c_119672bf-abf7-4a5d-8aee-d3fde8085ed9/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:04 crc kubenswrapper[4792]: I0318 17:13:04.266479 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-7xk4j_2a780ff4-ddc1-4f8b-a4b1-f24062af5089/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:04 crc kubenswrapper[4792]: I0318 17:13:04.346874 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69977cc675-l62x5_63e5fc07-9299-40b7-91c9-7a2442362d9a/keystone-api/0.log" Mar 18 17:13:04 crc kubenswrapper[4792]: I0318 17:13:04.521101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_513befbc-4cbb-472a-9770-376700a8d1bb/mysqld-exporter/0.log" Mar 18 17:13:04 crc kubenswrapper[4792]: I0318 17:13:04.993459 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5f66cfdb67-9fjs4_21384247-2994-41b5-9e8e-10f0e31e5ea9/neutron-api/0.log" Mar 18 17:13:05 crc kubenswrapper[4792]: I0318 17:13:05.218037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-9rqdb_1c471f93-cba4-46c2-9bdf-cb58f530f1a6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:05 crc kubenswrapper[4792]: I0318 17:13:05.290666 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5f66cfdb67-9fjs4_21384247-2994-41b5-9e8e-10f0e31e5ea9/neutron-httpd/0.log" Mar 18 17:13:05 
crc kubenswrapper[4792]: I0318 17:13:05.963707 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f0326f82-a981-420a-be10-4364a620bdfd/nova-cell0-conductor-conductor/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.293541 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b391e203-fca6-4bcb-870c-d04691525743/nova-cell1-conductor-conductor/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.326870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d5b6299c-8e67-4a48-8dd3-ef558e0f7b23/nova-api-log/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.566691 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4411b06a-98e3-4eb2-bfa9-cf954a003e3a/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.651844 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-btjw7_bf85aca0-8253-402e-92dd-df87a3dbaf01/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.816506 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d5b6299c-8e67-4a48-8dd3-ef558e0f7b23/nova-api-api/0.log" Mar 18 17:13:06 crc kubenswrapper[4792]: I0318 17:13:06.919797 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_edb37f7a-3e7f-42f6-8f05-f89ea71a1f02/nova-metadata-log/0.log" Mar 18 17:13:07 crc kubenswrapper[4792]: I0318 17:13:07.224101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f63e8923-c2dd-459a-8019-ae9fcdbe6f92/nova-scheduler-scheduler/0.log" Mar 18 17:13:07 crc kubenswrapper[4792]: I0318 17:13:07.519514 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_edb37f7a-3e7f-42f6-8f05-f89ea71a1f02/nova-metadata-metadata/0.log" Mar 18 17:13:07 crc kubenswrapper[4792]: I0318 17:13:07.842234 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a64098b6-eb41-40ef-8d9b-6dd69c107ee2/mysql-bootstrap/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.040022 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a64098b6-eb41-40ef-8d9b-6dd69c107ee2/mysql-bootstrap/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.087166 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a64098b6-eb41-40ef-8d9b-6dd69c107ee2/galera/1.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.136918 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a64098b6-eb41-40ef-8d9b-6dd69c107ee2/galera/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.236302 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4cc19e41-291b-4aa8-b862-2efc890cea99/memcached/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.274257 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38dfbae-0508-4b57-b5d8-d47fcdd35fd6/mysql-bootstrap/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.466448 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38dfbae-0508-4b57-b5d8-d47fcdd35fd6/mysql-bootstrap/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.565697 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38dfbae-0508-4b57-b5d8-d47fcdd35fd6/galera/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.566922 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_b38dfbae-0508-4b57-b5d8-d47fcdd35fd6/galera/1.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.574200 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2dfd536a-310d-4039-a397-2bcdcdc0c2c2/openstackclient/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.739312 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-v4226_bd4fe675-5de2-4d0f-88c4-611c24091ffa/openstack-network-exporter/0.log" Mar 18 17:13:08 crc kubenswrapper[4792]: I0318 17:13:08.742186 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m977k_b90ccac6-a973-4572-834a-f7215cfc72a7/ovn-controller/0.log" Mar 18 17:13:09 crc kubenswrapper[4792]: I0318 17:13:09.237011 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6xllm_e0f3bd33-05e2-4174-a371-af75ef9fdb7d/ovsdb-server-init/0.log" Mar 18 17:13:09 crc kubenswrapper[4792]: I0318 17:13:09.378841 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6xllm_e0f3bd33-05e2-4174-a371-af75ef9fdb7d/ovsdb-server/0.log" Mar 18 17:13:09 crc kubenswrapper[4792]: I0318 17:13:09.380422 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6xllm_e0f3bd33-05e2-4174-a371-af75ef9fdb7d/ovs-vswitchd/0.log" Mar 18 17:13:09 crc kubenswrapper[4792]: I0318 17:13:09.386010 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6xllm_e0f3bd33-05e2-4174-a371-af75ef9fdb7d/ovsdb-server-init/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.181830 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kgh74_8e02ebcc-002d-4a76-b1f6-12298f2beaa0/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.195667 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7c13a8c4-d4ee-4af5-95bd-c28a60350d14/openstack-network-exporter/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.203227 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7c13a8c4-d4ee-4af5-95bd-c28a60350d14/ovn-northd/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.379660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b19f3e39-4198-4eef-bbe8-67e28fcef034/ovsdbserver-nb/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.382827 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b19f3e39-4198-4eef-bbe8-67e28fcef034/openstack-network-exporter/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.532899 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f119c5e-bb19-41d0-b87c-4962192e94e5/openstack-network-exporter/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.590400 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f119c5e-bb19-41d0-b87c-4962192e94e5/ovsdbserver-sb/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.817321 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79db488c56-wbmrk_371480fb-9244-4210-9db0-30d2fffdc422/placement-api/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.843896 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-79db488c56-wbmrk_371480fb-9244-4210-9db0-30d2fffdc422/placement-log/0.log" Mar 18 17:13:10 crc kubenswrapper[4792]: I0318 17:13:10.870472 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb295773-c070-4d90-b351-cac7e8fa1017/init-config-reloader/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.042460 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb295773-c070-4d90-b351-cac7e8fa1017/init-config-reloader/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.050500 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb295773-c070-4d90-b351-cac7e8fa1017/prometheus/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.051366 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb295773-c070-4d90-b351-cac7e8fa1017/config-reloader/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.058632 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb295773-c070-4d90-b351-cac7e8fa1017/thanos-sidecar/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.244193 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24b920d2-ca55-4f2d-a313-06cbe39c81b8/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.449595 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24b920d2-ca55-4f2d-a313-06cbe39c81b8/rabbitmq/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.460475 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_24b920d2-ca55-4f2d-a313-06cbe39c81b8/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.502357 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b4b6514-5ee6-4653-acfc-b45efe6e7263/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.645670 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b4b6514-5ee6-4653-acfc-b45efe6e7263/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.709773 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-1_b2a1ad0b-1684-4f7b-a7f0-023c7a15286a/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.743773 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7b4b6514-5ee6-4653-acfc-b45efe6e7263/rabbitmq/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.943343 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_3a2e23b3-06c8-41e9-94d3-fa6fe815e906/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.952378 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_b2a1ad0b-1684-4f7b-a7f0-023c7a15286a/setup-container/0.log" Mar 18 17:13:11 crc kubenswrapper[4792]: I0318 17:13:11.977635 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_b2a1ad0b-1684-4f7b-a7f0-023c7a15286a/rabbitmq/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.136074 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_3a2e23b3-06c8-41e9-94d3-fa6fe815e906/setup-container/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.227086 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_3a2e23b3-06c8-41e9-94d3-fa6fe815e906/rabbitmq/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.261409 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qp6nq_df844859-17d2-4d51-9eb5-f4c9148267cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.366288 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-bfckz_60e44911-c925-4e0a-bdb7-849994798535/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.508590 4792 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s6jnn_c6394e15-1052-4fa6-9a74-bee5cce65ae7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.536922 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xqcf5_b7c1917f-d843-42d7-9473-ab2f98a7edf6/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.685098 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sqqx6_41821b7e-a517-44da-8768-18f9246d5bc2/ssh-known-hosts-edpm-deployment/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.779855 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7984d74779-bxqk7_e19c85aa-c5a1-4d0d-99ff-cc9283e5252f/proxy-server/0.log" Mar 18 17:13:12 crc kubenswrapper[4792]: I0318 17:13:12.903529 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7984d74779-bxqk7_e19c85aa-c5a1-4d0d-99ff-cc9283e5252f/proxy-httpd/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.154592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rks8d_5670c2b9-9a80-4670-a2d2-0135fbb5a77d/swift-ring-rebalance/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.274912 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/account-auditor/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.384912 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/account-reaper/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.387149 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/account-replicator/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.446827 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/container-auditor/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.458423 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/account-server/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.516018 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/container-replicator/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.577331 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/container-server/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.626055 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/container-updater/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.690523 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/object-auditor/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.691778 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/object-expirer/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.770264 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/object-replicator/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.807040 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/object-server/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.856266 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/object-updater/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.898214 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/rsync/0.log" Mar 18 17:13:13 crc kubenswrapper[4792]: I0318 17:13:13.922888 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3c345496-7b4e-41f0-a5ae-4c503e452221/swift-recon-cron/0.log" Mar 18 17:13:14 crc kubenswrapper[4792]: I0318 17:13:14.102457 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wj7lp_c675fc82-9ad3-4eae-8918-00ab1d6fd06d/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:14 crc kubenswrapper[4792]: I0318 17:13:14.196544 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-cf4zt_416434fb-7fe4-4872-9d1e-8bb317a4eab1/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:14 crc kubenswrapper[4792]: I0318 17:13:14.360326 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_831cf454-c068-4800-97d9-9d78d3a35fad/test-operator-logs-container/0.log" Mar 18 17:13:14 crc kubenswrapper[4792]: I0318 17:13:14.494721 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vb5ph_be9de932-130e-4207-bc36-6c6527e63684/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 17:13:15 crc kubenswrapper[4792]: I0318 17:13:15.082298 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_11070522-2520-4564-a02c-3bd460ae33fe/tempest-tests-tempest-tests-runner/0.log" Mar 18 17:13:16 crc kubenswrapper[4792]: I0318 17:13:16.855480 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:13:16 crc kubenswrapper[4792]: E0318 17:13:16.856616 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:13:31 crc kubenswrapper[4792]: I0318 17:13:31.866855 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:13:31 crc kubenswrapper[4792]: E0318 17:13:31.867446 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.351844 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/util/0.log" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.570982 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/pull/0.log" Mar 
18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.597094 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/pull/0.log" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.614754 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/util/0.log" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.761370 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/util/0.log" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.773850 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/pull/0.log" Mar 18 17:13:41 crc kubenswrapper[4792]: I0318 17:13:41.793671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d3ed97db3684ab3bb06c23abc950d2124d932b8e8fc10b2fc4452fd97pmzwj_f550007c-836c-4db3-b4cc-4ad8ffca5264/extract/0.log" Mar 18 17:13:42 crc kubenswrapper[4792]: I0318 17:13:42.221465 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-tls5q_6ccc988b-8909-4e90-b016-c94a1deb2de7/manager/0.log" Mar 18 17:13:42 crc kubenswrapper[4792]: I0318 17:13:42.571866 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-jtksk_aa2e6c5a-c94a-482a-aceb-156b1cc316d0/manager/0.log" Mar 18 17:13:42 crc kubenswrapper[4792]: I0318 17:13:42.853941 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:13:42 
crc kubenswrapper[4792]: E0318 17:13:42.854355 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:13:42 crc kubenswrapper[4792]: I0318 17:13:42.948480 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-nvs5w_af02f706-b3e4-4c4d-af74-3ef8ef2cd6a9/manager/0.log" Mar 18 17:13:43 crc kubenswrapper[4792]: I0318 17:13:43.087391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-26gf6_55d5f156-656e-4e2f-b368-e841124084d1/manager/0.log" Mar 18 17:13:43 crc kubenswrapper[4792]: I0318 17:13:43.669385 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-gc45m_d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512/manager/1.log" Mar 18 17:13:43 crc kubenswrapper[4792]: I0318 17:13:43.841728 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-chpgl_155eb4c3-aa63-4ec7-9824-1bef2045a68b/manager/0.log" Mar 18 17:13:43 crc kubenswrapper[4792]: I0318 17:13:43.907051 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-gc45m_d35e4a2f-55a6-4ff0-bc37-ccf5f47c2512/manager/0.log" Mar 18 17:13:44 crc kubenswrapper[4792]: I0318 17:13:44.268669 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-mf8vn_dbab4be0-7cc0-4e9c-8f84-5c9fd04d4e3b/manager/0.log" Mar 18 17:13:44 
crc kubenswrapper[4792]: I0318 17:13:44.491666 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-kmpvr_dd73a890-f234-415f-b99a-685059be7d48/manager/0.log" Mar 18 17:13:44 crc kubenswrapper[4792]: I0318 17:13:44.733222 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-xzpbw_3692a84a-23dc-4b6c-9c20-d97bd0e285d8/manager/0.log" Mar 18 17:13:44 crc kubenswrapper[4792]: I0318 17:13:44.754948 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-n4j9l_65722e7d-1557-437c-ae5c-383082933c8c/manager/0.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.068794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-77v8x_a518542e-e1c4-4754-9031-d3f1571abb27/manager/0.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.178094 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-hpr6x_abc215c2-57eb-4c7a-b19d-0ed3ccd67001/manager/0.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.311858 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-5c65h_05dba0ab-e659-4e0c-8713-4eebeca6edba/manager/1.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.428494 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-5c65h_05dba0ab-e659-4e0c-8713-4eebeca6edba/manager/0.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.593379 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp_8cf0ba21-2c05-4e3d-8925-114487cc4998/manager/1.log" Mar 18 17:13:45 crc kubenswrapper[4792]: I0318 17:13:45.750749 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-cfpqp_8cf0ba21-2c05-4e3d-8925-114487cc4998/manager/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.148387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7f795bfd45-wf9cm_407238a6-a2e5-420c-801b-8a4329eebadd/operator/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.193481 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gkmkk_cc92711e-be7a-4025-9077-cac9e5bc7df8/registry-server/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.504333 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-l27l2_a1327184-da65-478d-b7a7-15d0daa3ca95/manager/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.629379 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-fb96d_5e4dd350-9a5b-4626-8b3d-6b9c097b4be1/manager/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.835031 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b44rd_f172d1b7-2345-4bf5-ba2e-c142f4f8c482/operator/0.log" Mar 18 17:13:46 crc kubenswrapper[4792]: I0318 17:13:46.989184 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-s4j4w_5bb05e8f-3780-4bd4-a504-1be6a2887d9f/manager/0.log" Mar 18 17:13:47 crc kubenswrapper[4792]: I0318 17:13:47.363778 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-b7zpr_fbcfdc60-25a6-41e2-8dc1-eb9093393808/manager/1.log" Mar 18 17:13:47 crc kubenswrapper[4792]: I0318 17:13:47.492390 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-b7zpr_fbcfdc60-25a6-41e2-8dc1-eb9093393808/manager/0.log" Mar 18 17:13:47 crc kubenswrapper[4792]: I0318 17:13:47.610189 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5df8f6d8b4-s75wc_eb5bab1d-63b4-4ae0-8dfe-734700253a4f/manager/0.log" Mar 18 17:13:47 crc kubenswrapper[4792]: I0318 17:13:47.711786 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-z4nr9_96809e41-8656-4095-a2f9-9d69c31efe61/manager/0.log" Mar 18 17:13:48 crc kubenswrapper[4792]: I0318 17:13:48.153130 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c79f466d7-95zwp_14667803-000a-4186-8eb1-da78ce4812a0/manager/0.log" Mar 18 17:13:54 crc kubenswrapper[4792]: I0318 17:13:54.642371 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-rpk5q_79896742-17fd-4960-ae5b-af3c83550a4e/manager/0.log" Mar 18 17:13:55 crc kubenswrapper[4792]: I0318 17:13:55.855931 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:13:55 crc kubenswrapper[4792]: E0318 17:13:55.856543 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.148431 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564234-d7r6l"] Mar 18 17:14:00 crc kubenswrapper[4792]: E0318 17:14:00.149424 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34e6919-2028-420e-b978-7ef420d10bb0" containerName="container-00" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.149438 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34e6919-2028-420e-b978-7ef420d10bb0" containerName="container-00" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.149691 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34e6919-2028-420e-b978-7ef420d10bb0" containerName="container-00" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.150512 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.153763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.154756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.156086 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.196010 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-d7r6l"] Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.217558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqwv\" (UniqueName: 
\"kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv\") pod \"auto-csr-approver-29564234-d7r6l\" (UID: \"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d\") " pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.319539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqwv\" (UniqueName: \"kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv\") pod \"auto-csr-approver-29564234-d7r6l\" (UID: \"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d\") " pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.342503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqwv\" (UniqueName: \"kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv\") pod \"auto-csr-approver-29564234-d7r6l\" (UID: \"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d\") " pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:00 crc kubenswrapper[4792]: I0318 17:14:00.482623 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:01 crc kubenswrapper[4792]: I0318 17:14:01.094036 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-d7r6l"] Mar 18 17:14:01 crc kubenswrapper[4792]: I0318 17:14:01.802340 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" event={"ID":"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d","Type":"ContainerStarted","Data":"25bd429bc00535a177a2e97cf4e40798f57c5773025584db8551b7f83f9f8998"} Mar 18 17:14:02 crc kubenswrapper[4792]: I0318 17:14:02.812959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" event={"ID":"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d","Type":"ContainerStarted","Data":"0645656482ab0059fa311ddb6336d8a466f7ac1e1b3d4bf6b1d7225a76529ecf"} Mar 18 17:14:02 crc kubenswrapper[4792]: I0318 17:14:02.834655 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" podStartSLOduration=1.838444615 podStartE2EDuration="2.834625815s" podCreationTimestamp="2026-03-18 17:14:00 +0000 UTC" firstStartedPulling="2026-03-18 17:14:01.09829254 +0000 UTC m=+5989.967621477" lastFinishedPulling="2026-03-18 17:14:02.09447374 +0000 UTC m=+5990.963802677" observedRunningTime="2026-03-18 17:14:02.824627849 +0000 UTC m=+5991.693956806" watchObservedRunningTime="2026-03-18 17:14:02.834625815 +0000 UTC m=+5991.703954752" Mar 18 17:14:03 crc kubenswrapper[4792]: I0318 17:14:03.827712 4792 generic.go:334] "Generic (PLEG): container finished" podID="65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" containerID="0645656482ab0059fa311ddb6336d8a466f7ac1e1b3d4bf6b1d7225a76529ecf" exitCode=0 Mar 18 17:14:03 crc kubenswrapper[4792]: I0318 17:14:03.828125 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" 
event={"ID":"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d","Type":"ContainerDied","Data":"0645656482ab0059fa311ddb6336d8a466f7ac1e1b3d4bf6b1d7225a76529ecf"} Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.381014 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.558871 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqwv\" (UniqueName: \"kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv\") pod \"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d\" (UID: \"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d\") " Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.576559 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv" (OuterVolumeSpecName: "kube-api-access-hzqwv") pod "65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" (UID: "65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d"). InnerVolumeSpecName "kube-api-access-hzqwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.661829 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzqwv\" (UniqueName: \"kubernetes.io/projected/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d-kube-api-access-hzqwv\") on node \"crc\" DevicePath \"\"" Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.854275 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.870368 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564234-d7r6l" event={"ID":"65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d","Type":"ContainerDied","Data":"25bd429bc00535a177a2e97cf4e40798f57c5773025584db8551b7f83f9f8998"} Mar 18 17:14:05 crc kubenswrapper[4792]: I0318 17:14:05.870441 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25bd429bc00535a177a2e97cf4e40798f57c5773025584db8551b7f83f9f8998" Mar 18 17:14:06 crc kubenswrapper[4792]: I0318 17:14:06.458642 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-qqbff"] Mar 18 17:14:06 crc kubenswrapper[4792]: I0318 17:14:06.472508 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564228-qqbff"] Mar 18 17:14:07 crc kubenswrapper[4792]: I0318 17:14:07.867795 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f43cc3-e797-4876-b8b4-2bb3a0d16df0" path="/var/lib/kubelet/pods/d3f43cc3-e797-4876-b8b4-2bb3a0d16df0/volumes" Mar 18 17:14:08 crc kubenswrapper[4792]: I0318 17:14:08.854220 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:14:08 crc kubenswrapper[4792]: E0318 17:14:08.854807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:14:09 crc kubenswrapper[4792]: I0318 17:14:09.970004 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zcnn4_b0f66845-303e-42cf-b091-be0ac57cba20/control-plane-machine-set-operator/0.log" Mar 18 17:14:10 crc kubenswrapper[4792]: I0318 17:14:10.179477 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nc8_d4c0c858-0632-49da-b6d4-1b2e9f84f690/kube-rbac-proxy/0.log" Mar 18 17:14:10 crc kubenswrapper[4792]: I0318 17:14:10.213144 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nc8_d4c0c858-0632-49da-b6d4-1b2e9f84f690/machine-api-operator/0.log" Mar 18 17:14:21 crc kubenswrapper[4792]: I0318 17:14:21.866650 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:14:21 crc kubenswrapper[4792]: E0318 17:14:21.867855 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:14:22 crc kubenswrapper[4792]: I0318 17:14:22.394241 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-k4xld_4766d1f1-5397-48cd-8429-daa6bc26a860/cert-manager-controller/0.log" Mar 18 17:14:22 crc kubenswrapper[4792]: I0318 17:14:22.581480 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dqww9_4e1f9db4-54f6-4217-a3cb-e9ea440f186e/cert-manager-cainjector/0.log" Mar 18 17:14:22 crc kubenswrapper[4792]: I0318 17:14:22.632868 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dbbd4_7d4badb4-1388-47c8-aed9-f8478388af41/cert-manager-webhook/0.log" Mar 18 17:14:35 crc kubenswrapper[4792]: I0318 17:14:35.854509 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:14:35 crc kubenswrapper[4792]: E0318 17:14:35.855388 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:14:35 crc kubenswrapper[4792]: I0318 17:14:35.963076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-g4ld2_910601c0-aac3-4fe1-9735-90b6329e26c3/nmstate-console-plugin/0.log" Mar 18 17:14:36 crc kubenswrapper[4792]: I0318 17:14:36.154695 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s4q9g_c7af6f36-f51d-4d49-85d2-5d4081ad57a6/nmstate-handler/0.log" Mar 18 17:14:36 crc kubenswrapper[4792]: I0318 17:14:36.210479 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-kmt7t_ae143031-3c99-45c6-a0bf-6e8b8a3c1d14/kube-rbac-proxy/0.log" Mar 18 17:14:36 crc kubenswrapper[4792]: I0318 17:14:36.358322 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-kmt7t_ae143031-3c99-45c6-a0bf-6e8b8a3c1d14/nmstate-metrics/0.log" Mar 18 17:14:36 crc kubenswrapper[4792]: I0318 17:14:36.361117 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-mskk6_465e5e04-b3e9-4b8c-98dc-abd9f050de38/nmstate-operator/0.log" 
Mar 18 17:14:36 crc kubenswrapper[4792]: I0318 17:14:36.491881 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rlx82_7af72a3d-98a7-4a83-affa-3d382184fc59/nmstate-webhook/0.log" Mar 18 17:14:49 crc kubenswrapper[4792]: I0318 17:14:49.932675 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/kube-rbac-proxy/0.log" Mar 18 17:14:49 crc kubenswrapper[4792]: I0318 17:14:49.942786 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/manager/1.log" Mar 18 17:14:50 crc kubenswrapper[4792]: I0318 17:14:50.134211 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/manager/0.log" Mar 18 17:14:50 crc kubenswrapper[4792]: I0318 17:14:50.855268 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:14:50 crc kubenswrapper[4792]: E0318 17:14:50.855613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.492701 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"] Mar 18 17:14:55 crc kubenswrapper[4792]: E0318 17:14:55.493778 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" containerName="oc" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.493794 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" containerName="oc" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.494050 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" containerName="oc" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.498693 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.508001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"] Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.587024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.587416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hldln\" (UniqueName: \"kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.587615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " 
pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.689670 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.690121 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hldln\" (UniqueName: \"kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.690229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.690278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.690671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" 
Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.713674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hldln\" (UniqueName: \"kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln\") pod \"redhat-marketplace-prmz7\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") " pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:55 crc kubenswrapper[4792]: I0318 17:14:55.833759 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prmz7" Mar 18 17:14:56 crc kubenswrapper[4792]: W0318 17:14:56.441019 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e52bd1_4439_45db_a400_05fbb1aa778a.slice/crio-3853662e37e4ea4e56559fbfb8285d5ef27bac4c279404b90133d7cb5428b9e8 WatchSource:0}: Error finding container 3853662e37e4ea4e56559fbfb8285d5ef27bac4c279404b90133d7cb5428b9e8: Status 404 returned error can't find the container with id 3853662e37e4ea4e56559fbfb8285d5ef27bac4c279404b90133d7cb5428b9e8 Mar 18 17:14:56 crc kubenswrapper[4792]: I0318 17:14:56.443012 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"] Mar 18 17:14:57 crc kubenswrapper[4792]: I0318 17:14:57.420925 4792 generic.go:334] "Generic (PLEG): container finished" podID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerID="eb83cfe5da9e28ef0be509ded19549f7201736923905226c544e872cab25f571" exitCode=0 Mar 18 17:14:57 crc kubenswrapper[4792]: I0318 17:14:57.421043 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerDied","Data":"eb83cfe5da9e28ef0be509ded19549f7201736923905226c544e872cab25f571"} Mar 18 17:14:57 crc kubenswrapper[4792]: I0318 17:14:57.421278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerStarted","Data":"3853662e37e4ea4e56559fbfb8285d5ef27bac4c279404b90133d7cb5428b9e8"} Mar 18 17:14:59 crc kubenswrapper[4792]: I0318 17:14:59.444245 4792 generic.go:334] "Generic (PLEG): container finished" podID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerID="9b89b645971954746e0ce684d939dcf71e7af544b5d6c3bc88af9454502ff5d4" exitCode=0 Mar 18 17:14:59 crc kubenswrapper[4792]: I0318 17:14:59.444333 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerDied","Data":"9b89b645971954746e0ce684d939dcf71e7af544b5d6c3bc88af9454502ff5d4"} Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.171645 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc"] Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.174964 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.179427 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.183628 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.189585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc"] Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.304892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.305206 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2w6m\" (UniqueName: \"kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.305225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.409132 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.409222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2w6m\" (UniqueName: \"kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.409250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.410788 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.417132 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.428435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2w6m\" (UniqueName: \"kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m\") pod \"collect-profiles-29564235-mzcxc\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.461521 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerStarted","Data":"a04eddaa24faa01455c56a7d468805dcb010be3c41d31ad23164d88d9fbfd61e"} Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.484756 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-prmz7" podStartSLOduration=2.745579662 podStartE2EDuration="5.484732556s" podCreationTimestamp="2026-03-18 17:14:55 +0000 UTC" firstStartedPulling="2026-03-18 17:14:57.423635005 +0000 UTC m=+6046.292963942" lastFinishedPulling="2026-03-18 17:15:00.162787899 +0000 UTC m=+6049.032116836" observedRunningTime="2026-03-18 17:15:00.478248632 +0000 UTC m=+6049.347577579" watchObservedRunningTime="2026-03-18 17:15:00.484732556 +0000 UTC m=+6049.354061493" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.503537 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:00 crc kubenswrapper[4792]: I0318 17:15:00.996409 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc"] Mar 18 17:15:00 crc kubenswrapper[4792]: W0318 17:15:00.998321 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2d4d788_5d4d_468d_b308_b2ce1d936666.slice/crio-33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262 WatchSource:0}: Error finding container 33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262: Status 404 returned error can't find the container with id 33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262 Mar 18 17:15:01 crc kubenswrapper[4792]: I0318 17:15:01.476102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" event={"ID":"e2d4d788-5d4d-468d-b308-b2ce1d936666","Type":"ContainerStarted","Data":"7cb8e5ae93ec867e4516253027b69db4f50a9c70eb93421e9c377ca62cb613a9"} Mar 18 17:15:01 crc kubenswrapper[4792]: I0318 17:15:01.476433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" event={"ID":"e2d4d788-5d4d-468d-b308-b2ce1d936666","Type":"ContainerStarted","Data":"33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262"} Mar 18 17:15:01 crc kubenswrapper[4792]: I0318 17:15:01.520871 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" podStartSLOduration=1.520849157 podStartE2EDuration="1.520849157s" podCreationTimestamp="2026-03-18 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
17:15:01.509451617 +0000 UTC m=+6050.378780564" watchObservedRunningTime="2026-03-18 17:15:01.520849157 +0000 UTC m=+6050.390178094" Mar 18 17:15:01 crc kubenswrapper[4792]: I0318 17:15:01.866867 4792 scope.go:117] "RemoveContainer" containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:15:02 crc kubenswrapper[4792]: I0318 17:15:02.498687 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2d4d788-5d4d-468d-b308-b2ce1d936666" containerID="7cb8e5ae93ec867e4516253027b69db4f50a9c70eb93421e9c377ca62cb613a9" exitCode=0 Mar 18 17:15:02 crc kubenswrapper[4792]: I0318 17:15:02.499360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" event={"ID":"e2d4d788-5d4d-468d-b308-b2ce1d936666","Type":"ContainerDied","Data":"7cb8e5ae93ec867e4516253027b69db4f50a9c70eb93421e9c377ca62cb613a9"} Mar 18 17:15:02 crc kubenswrapper[4792]: I0318 17:15:02.545163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a"} Mar 18 17:15:02 crc kubenswrapper[4792]: E0318 17:15:02.731738 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2d4d788_5d4d_468d_b308_b2ce1d936666.slice/crio-conmon-7cb8e5ae93ec867e4516253027b69db4f50a9c70eb93421e9c377ca62cb613a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2d4d788_5d4d_468d_b308_b2ce1d936666.slice/crio-7cb8e5ae93ec867e4516253027b69db4f50a9c70eb93421e9c377ca62cb613a9.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:15:03 crc kubenswrapper[4792]: I0318 17:15:03.961053 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-7m5sb_58802970-175f-48a9-aa0b-25cbd849fecf/prometheus-operator/0.log" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.000908 4792 scope.go:117] "RemoveContainer" containerID="0cd3debd5cd8c9fd84eeb59678554d92da41752100c8879013306eb772ada716" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.213162 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.267486 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_997642b8-111c-438c-906c-ace1a270f33b/prometheus-operator-admission-webhook/0.log" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.281010 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_d350c21d-f3fd-4b9e-a5f2-d7172fb87714/prometheus-operator-admission-webhook/0.log" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.317679 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume\") pod \"e2d4d788-5d4d-468d-b308-b2ce1d936666\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.317740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2w6m\" (UniqueName: \"kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m\") pod \"e2d4d788-5d4d-468d-b308-b2ce1d936666\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.318013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume\") pod \"e2d4d788-5d4d-468d-b308-b2ce1d936666\" (UID: \"e2d4d788-5d4d-468d-b308-b2ce1d936666\") " Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.318911 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2d4d788-5d4d-468d-b308-b2ce1d936666" (UID: "e2d4d788-5d4d-468d-b308-b2ce1d936666"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.320369 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d4d788-5d4d-468d-b308-b2ce1d936666-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.331921 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m" (OuterVolumeSpecName: "kube-api-access-k2w6m") pod "e2d4d788-5d4d-468d-b308-b2ce1d936666" (UID: "e2d4d788-5d4d-468d-b308-b2ce1d936666"). InnerVolumeSpecName "kube-api-access-k2w6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.342481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2d4d788-5d4d-468d-b308-b2ce1d936666" (UID: "e2d4d788-5d4d-468d-b308-b2ce1d936666"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.422915 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d4d788-5d4d-468d-b308-b2ce1d936666-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.422946 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2w6m\" (UniqueName: \"kubernetes.io/projected/e2d4d788-5d4d-468d-b308-b2ce1d936666-kube-api-access-k2w6m\") on node \"crc\" DevicePath \"\""
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.508960 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-ct7p5_675f6ffb-b144-4efc-b47a-81c748cb4765/observability-ui-dashboards/0.log"
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.538876 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-x5w94_255ea945-6e83-4ead-b609-b47a6b5eaafa/operator/0.log"
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.580846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc" event={"ID":"e2d4d788-5d4d-468d-b308-b2ce1d936666","Type":"ContainerDied","Data":"33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262"}
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.580891 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ecef2183a69c112d8a7117e4017d672176542a324b9d081dc81bb5c3431262"
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.580957 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564235-mzcxc"
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.611019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5"]
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.638824 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564190-z2pc5"]
Mar 18 17:15:04 crc kubenswrapper[4792]: I0318 17:15:04.796514 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-d9577b4dd-zfrmv_15bde542-1ffd-48b4-b2cf-98d98348920e/perses-operator/0.log"
Mar 18 17:15:05 crc kubenswrapper[4792]: I0318 17:15:05.834936 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:05 crc kubenswrapper[4792]: I0318 17:15:05.835357 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:05 crc kubenswrapper[4792]: I0318 17:15:05.873359 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8b9640-2056-47b6-9982-b0feea515131" path="/var/lib/kubelet/pods/7a8b9640-2056-47b6-9982-b0feea515131/volumes"
Mar 18 17:15:05 crc kubenswrapper[4792]: I0318 17:15:05.890353 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:06 crc kubenswrapper[4792]: I0318 17:15:06.656479 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:09 crc kubenswrapper[4792]: I0318 17:15:09.483962 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"]
Mar 18 17:15:09 crc kubenswrapper[4792]: I0318 17:15:09.486300 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-prmz7" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="registry-server" containerID="cri-o://a04eddaa24faa01455c56a7d468805dcb010be3c41d31ad23164d88d9fbfd61e" gracePeriod=2
Mar 18 17:15:09 crc kubenswrapper[4792]: I0318 17:15:09.637955 4792 generic.go:334] "Generic (PLEG): container finished" podID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerID="a04eddaa24faa01455c56a7d468805dcb010be3c41d31ad23164d88d9fbfd61e" exitCode=0
Mar 18 17:15:09 crc kubenswrapper[4792]: I0318 17:15:09.638022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerDied","Data":"a04eddaa24faa01455c56a7d468805dcb010be3c41d31ad23164d88d9fbfd61e"}
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.007327 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.060028 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hldln\" (UniqueName: \"kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln\") pod \"94e52bd1-4439-45db-a400-05fbb1aa778a\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") "
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.060284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content\") pod \"94e52bd1-4439-45db-a400-05fbb1aa778a\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") "
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.060356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities\") pod \"94e52bd1-4439-45db-a400-05fbb1aa778a\" (UID: \"94e52bd1-4439-45db-a400-05fbb1aa778a\") "
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.061560 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities" (OuterVolumeSpecName: "utilities") pod "94e52bd1-4439-45db-a400-05fbb1aa778a" (UID: "94e52bd1-4439-45db-a400-05fbb1aa778a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.067619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln" (OuterVolumeSpecName: "kube-api-access-hldln") pod "94e52bd1-4439-45db-a400-05fbb1aa778a" (UID: "94e52bd1-4439-45db-a400-05fbb1aa778a"). InnerVolumeSpecName "kube-api-access-hldln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.084877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94e52bd1-4439-45db-a400-05fbb1aa778a" (UID: "94e52bd1-4439-45db-a400-05fbb1aa778a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.163098 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.163377 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94e52bd1-4439-45db-a400-05fbb1aa778a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.163387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hldln\" (UniqueName: \"kubernetes.io/projected/94e52bd1-4439-45db-a400-05fbb1aa778a-kube-api-access-hldln\") on node \"crc\" DevicePath \"\""
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.651319 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prmz7" event={"ID":"94e52bd1-4439-45db-a400-05fbb1aa778a","Type":"ContainerDied","Data":"3853662e37e4ea4e56559fbfb8285d5ef27bac4c279404b90133d7cb5428b9e8"}
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.651364 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prmz7"
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.651377 4792 scope.go:117] "RemoveContainer" containerID="a04eddaa24faa01455c56a7d468805dcb010be3c41d31ad23164d88d9fbfd61e"
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.686742 4792 scope.go:117] "RemoveContainer" containerID="9b89b645971954746e0ce684d939dcf71e7af544b5d6c3bc88af9454502ff5d4"
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.704712 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"]
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.720264 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-prmz7"]
Mar 18 17:15:10 crc kubenswrapper[4792]: I0318 17:15:10.722378 4792 scope.go:117] "RemoveContainer" containerID="eb83cfe5da9e28ef0be509ded19549f7201736923905226c544e872cab25f571"
Mar 18 17:15:11 crc kubenswrapper[4792]: I0318 17:15:11.872212 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" path="/var/lib/kubelet/pods/94e52bd1-4439-45db-a400-05fbb1aa778a/volumes"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.037605 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-zx4wt_3ff565a6-f95c-4656-be6c-cd52028bc42d/cluster-logging-operator/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.204671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-pmw4h_9b98f4f7-9eac-4059-b29a-7bb4c79cf5ae/collector/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.308575 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_d9a6fb1e-3b68-4210-9322-e13a634fac2a/loki-compactor/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.411679 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-dfcx2_1c367fec-09d4-46fa-8900-0c508ced5de9/loki-distributor/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.510137 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-599d7cd94d-7f8hf_ff112f55-c823-4d01-a355-08279e6a0391/gateway/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.575402 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-599d7cd94d-7f8hf_ff112f55-c823-4d01-a355-08279e6a0391/opa/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.665477 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-599d7cd94d-c8sjl_e0054d36-2f0d-43c8-93d2-774d775a22ea/gateway/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.704930 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-599d7cd94d-c8sjl_e0054d36-2f0d-43c8-93d2-774d775a22ea/opa/0.log"
Mar 18 17:15:19 crc kubenswrapper[4792]: I0318 17:15:19.840232 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_9530b94d-2bb9-4e98-832f-07c4d8b2277a/loki-index-gateway/0.log"
Mar 18 17:15:20 crc kubenswrapper[4792]: I0318 17:15:20.006850 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c6c59cfd-2add-4b4e-81c1-bacc77deae06/loki-ingester/0.log"
Mar 18 17:15:20 crc kubenswrapper[4792]: I0318 17:15:20.075358 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-86w5q_f9dbb2aa-f06a-431d-b181-29315e9170cb/loki-querier/0.log"
Mar 18 17:15:20 crc kubenswrapper[4792]: I0318 17:15:20.206292 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-gqm44_8927cd79-8eff-4f53-a676-782cbb366e9c/loki-query-frontend/0.log"
Mar 18 17:15:33 crc kubenswrapper[4792]: I0318 17:15:33.499422 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9k68t_260602fc-bedf-40ec-92e7-a96e3ee009f0/kube-rbac-proxy/0.log"
Mar 18 17:15:33 crc kubenswrapper[4792]: I0318 17:15:33.716529 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9k68t_260602fc-bedf-40ec-92e7-a96e3ee009f0/controller/0.log"
Mar 18 17:15:33 crc kubenswrapper[4792]: I0318 17:15:33.759212 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-frr-files/0.log"
Mar 18 17:15:33 crc kubenswrapper[4792]: I0318 17:15:33.981410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-frr-files/0.log"
Mar 18 17:15:33 crc kubenswrapper[4792]: I0318 17:15:33.996711 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-metrics/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.002170 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-reloader/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.024752 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-reloader/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.196383 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-frr-files/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.219800 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-metrics/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.229019 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-metrics/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.240717 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-reloader/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.406469 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-frr-files/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.432650 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-reloader/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.443998 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/cp-metrics/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.453676 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/controller/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.647431 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/frr-metrics/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.672997 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/kube-rbac-proxy/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.907482 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/kube-rbac-proxy-frr/0.log"
Mar 18 17:15:34 crc kubenswrapper[4792]: I0318 17:15:34.953219 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/reloader/0.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.113778 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/frr/1.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.214477 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-86pbc_423d82c6-fd0b-4cb5-8ff2-501f479a9a73/frr-k8s-webhook-server/0.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.410853 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cfbbd978d-5f96z_ae1d2de8-ac87-4f0e-97c5-3bbb88279055/manager/1.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.517459 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cfbbd978d-5f96z_ae1d2de8-ac87-4f0e-97c5-3bbb88279055/manager/0.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.643406 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78dddf6df5-kxk85_8bdbd945-a92a-471b-8a37-c999fe503caa/webhook-server/0.log"
Mar 18 17:15:35 crc kubenswrapper[4792]: I0318 17:15:35.857486 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kgc9b_ac5ba665-4ead-4469-9d1c-c777bf26d579/kube-rbac-proxy/0.log"
Mar 18 17:15:36 crc kubenswrapper[4792]: I0318 17:15:36.475444 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kgc9b_ac5ba665-4ead-4469-9d1c-c777bf26d579/speaker/0.log"
Mar 18 17:15:36 crc kubenswrapper[4792]: I0318 17:15:36.799113 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kvj8m_fb6ddafa-95ff-43b2-be7b-352a7fab9d05/frr/0.log"
Mar 18 17:15:48 crc kubenswrapper[4792]: I0318 17:15:48.682067 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/util/0.log"
Mar 18 17:15:48 crc kubenswrapper[4792]: I0318 17:15:48.906659 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/util/0.log"
Mar 18 17:15:48 crc kubenswrapper[4792]: I0318 17:15:48.927933 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/pull/0.log"
Mar 18 17:15:48 crc kubenswrapper[4792]: I0318 17:15:48.958704 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/pull/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.129187 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/util/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.141223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/pull/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.163833 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ss9lx_a3864929-6390-4703-b97d-50451aae73fe/extract/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.327532 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/util/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.523911 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/pull/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.547628 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/util/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.555540 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/pull/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.711809 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/pull/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.741948 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/extract/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.748504 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15hl8k_e6dbd030-bd91-4ac2-9140-3bc0bc5214b3/util/0.log"
Mar 18 17:15:49 crc kubenswrapper[4792]: I0318 17:15:49.907134 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.082180 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.106756 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.116438 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.297307 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.318499 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.368709 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5qgj6t_3245cd7d-9b25-4016-a86b-44e81a9e2fb5/extract/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.513372 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.666227 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.685410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.685574 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.846614 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/util/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.862274 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/pull/0.log"
Mar 18 17:15:50 crc kubenswrapper[4792]: I0318 17:15:50.890198 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6ct2zz7_e0ba453e-25c7-4e11-a393-74a6a4ee6e56/extract/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.017272 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/util/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.225329 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/util/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.233808 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/pull/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.264530 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/pull/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.400509 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/pull/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.416747 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/util/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.433905 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kznrq_9334cbd7-09b7-446e-ba60-975e332a9630/extract/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.573356 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-utilities/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.830369 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-content/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.830737 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-content/0.log"
Mar 18 17:15:51 crc kubenswrapper[4792]: I0318 17:15:51.833271 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-utilities/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.039432 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-utilities/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.054126 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/extract-content/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.294095 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-utilities/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.581684 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-content/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.608682 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-utilities/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.674533 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-content/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.896673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-utilities/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.935912 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kzm75_dcf562be-1a5c-41e2-9355-706b833cb56e/registry-server/0.log"
Mar 18 17:15:52 crc kubenswrapper[4792]: I0318 17:15:52.949516 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/extract-content/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.130149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-sqkxp_71255010-a6ae-4abf-88f1-f6c61c416ca1/marketplace-operator/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.282481 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-utilities/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.463828 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-utilities/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.500205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-content/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.596773 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-content/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.846025 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-content/0.log"
Mar 18 17:15:53 crc kubenswrapper[4792]: I0318 17:15:53.898534 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/extract-utilities/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.048664 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-utilities/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.168094 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zw8n7_e5ef8d1c-3435-4dcb-8397-2314c8795c3b/registry-server/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.272573 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-content/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.276839 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6lntk_1f303ba2-d191-4ad6-a474-de409ea5475b/registry-server/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.309014 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-content/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.311819 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-utilities/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.495771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-utilities/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.552561 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/extract-content/0.log"
Mar 18 17:15:54 crc kubenswrapper[4792]: I0318 17:15:54.858193 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chj47_143df0e5-40e9-4536-8285-509497426831/registry-server/0.log"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.647201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"]
Mar 18 17:15:58 crc kubenswrapper[4792]: E0318 17:15:58.648370 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d4d788-5d4d-468d-b308-b2ce1d936666" containerName="collect-profiles"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648389 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d4d788-5d4d-468d-b308-b2ce1d936666" containerName="collect-profiles"
Mar 18 17:15:58 crc kubenswrapper[4792]: E0318 17:15:58.648442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="registry-server"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648450 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="registry-server"
Mar 18 17:15:58 crc kubenswrapper[4792]: E0318 17:15:58.648483 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="extract-content"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648491 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="extract-content"
Mar 18 17:15:58 crc kubenswrapper[4792]: E0318 17:15:58.648516 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="extract-utilities"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="extract-utilities"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648799 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d4d788-5d4d-468d-b308-b2ce1d936666" containerName="collect-profiles"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.648831 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e52bd1-4439-45db-a400-05fbb1aa778a" containerName="registry-server"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.651686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.662547 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"]
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.819979 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvx9k\" (UniqueName: \"kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.820563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.820724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.922886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx9k\" (UniqueName: \"kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.923098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.923167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.923609 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4"
Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.923721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.942161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx9k\" (UniqueName: \"kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k\") pod \"redhat-operators-qjzd4\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:15:58 crc kubenswrapper[4792]: I0318 17:15:58.980151 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:15:59 crc kubenswrapper[4792]: I0318 17:15:59.703160 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"] Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.157890 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564236-jb4sr"] Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.160294 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.162899 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.163471 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.167544 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.173784 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-jb4sr"] Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.196614 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerID="440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74" exitCode=0 Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.196911 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerDied","Data":"440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74"} Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.197041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerStarted","Data":"da325c17d0ee94bc63d738374062e93d50b0c59fa058289bda5fb7b4e5d12f97"} Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.202652 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.257105 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tsvp8\" (UniqueName: \"kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8\") pod \"auto-csr-approver-29564236-jb4sr\" (UID: \"c04aa62d-680a-43f2-9505-32ed4c7eef88\") " pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.359541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvp8\" (UniqueName: \"kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8\") pod \"auto-csr-approver-29564236-jb4sr\" (UID: \"c04aa62d-680a-43f2-9505-32ed4c7eef88\") " pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.380338 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvp8\" (UniqueName: \"kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8\") pod \"auto-csr-approver-29564236-jb4sr\" (UID: \"c04aa62d-680a-43f2-9505-32ed4c7eef88\") " pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:00 crc kubenswrapper[4792]: I0318 17:16:00.490459 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:01 crc kubenswrapper[4792]: I0318 17:16:01.005126 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-jb4sr"] Mar 18 17:16:01 crc kubenswrapper[4792]: W0318 17:16:01.007797 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04aa62d_680a_43f2_9505_32ed4c7eef88.slice/crio-3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd WatchSource:0}: Error finding container 3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd: Status 404 returned error can't find the container with id 3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd Mar 18 17:16:01 crc kubenswrapper[4792]: I0318 17:16:01.211478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" event={"ID":"c04aa62d-680a-43f2-9505-32ed4c7eef88","Type":"ContainerStarted","Data":"3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd"} Mar 18 17:16:01 crc kubenswrapper[4792]: I0318 17:16:01.214205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerStarted","Data":"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be"} Mar 18 17:16:04 crc kubenswrapper[4792]: I0318 17:16:04.389005 4792 scope.go:117] "RemoveContainer" containerID="88ee34d1fbe4b8b44f4ff64f15ba5e687dd7fb58f1c895047874626a2a4ff265" Mar 18 17:16:05 crc kubenswrapper[4792]: I0318 17:16:05.260337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" event={"ID":"c04aa62d-680a-43f2-9505-32ed4c7eef88","Type":"ContainerStarted","Data":"ae647c07870e7b3e0ae7e021d0a11b8e20ed4c75326a4ad6703f31dca415148d"} Mar 18 17:16:05 crc kubenswrapper[4792]: I0318 
17:16:05.278825 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" podStartSLOduration=3.39983683 podStartE2EDuration="5.278800018s" podCreationTimestamp="2026-03-18 17:16:00 +0000 UTC" firstStartedPulling="2026-03-18 17:16:01.011078372 +0000 UTC m=+6109.880407309" lastFinishedPulling="2026-03-18 17:16:02.89004156 +0000 UTC m=+6111.759370497" observedRunningTime="2026-03-18 17:16:05.27538926 +0000 UTC m=+6114.144718197" watchObservedRunningTime="2026-03-18 17:16:05.278800018 +0000 UTC m=+6114.148128945" Mar 18 17:16:06 crc kubenswrapper[4792]: I0318 17:16:06.275234 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerID="a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be" exitCode=0 Mar 18 17:16:06 crc kubenswrapper[4792]: I0318 17:16:06.275316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerDied","Data":"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be"} Mar 18 17:16:06 crc kubenswrapper[4792]: I0318 17:16:06.279017 4792 generic.go:334] "Generic (PLEG): container finished" podID="c04aa62d-680a-43f2-9505-32ed4c7eef88" containerID="ae647c07870e7b3e0ae7e021d0a11b8e20ed4c75326a4ad6703f31dca415148d" exitCode=0 Mar 18 17:16:06 crc kubenswrapper[4792]: I0318 17:16:06.279054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" event={"ID":"c04aa62d-680a-43f2-9505-32ed4c7eef88","Type":"ContainerDied","Data":"ae647c07870e7b3e0ae7e021d0a11b8e20ed4c75326a4ad6703f31dca415148d"} Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.299214 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" 
event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerStarted","Data":"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d"} Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.321758 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qjzd4" podStartSLOduration=2.786781848 podStartE2EDuration="9.321734855s" podCreationTimestamp="2026-03-18 17:15:58 +0000 UTC" firstStartedPulling="2026-03-18 17:16:00.199418069 +0000 UTC m=+6109.068747006" lastFinishedPulling="2026-03-18 17:16:06.734371076 +0000 UTC m=+6115.603700013" observedRunningTime="2026-03-18 17:16:07.319597367 +0000 UTC m=+6116.188926304" watchObservedRunningTime="2026-03-18 17:16:07.321734855 +0000 UTC m=+6116.191063802" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.723717 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-845798d5f7-27dml_997642b8-111c-438c-906c-ace1a270f33b/prometheus-operator-admission-webhook/0.log" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.773500 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-7m5sb_58802970-175f-48a9-aa0b-25cbd849fecf/prometheus-operator/0.log" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.783076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-845798d5f7-7ptg8_d350c21d-f3fd-4b9e-a5f2-d7172fb87714/prometheus-operator-admission-webhook/0.log" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.856319 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.962246 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvp8\" (UniqueName: \"kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8\") pod \"c04aa62d-680a-43f2-9505-32ed4c7eef88\" (UID: \"c04aa62d-680a-43f2-9505-32ed4c7eef88\") " Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.970807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8" (OuterVolumeSpecName: "kube-api-access-tsvp8") pod "c04aa62d-680a-43f2-9505-32ed4c7eef88" (UID: "c04aa62d-680a-43f2-9505-32ed4c7eef88"). InnerVolumeSpecName "kube-api-access-tsvp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:16:07 crc kubenswrapper[4792]: I0318 17:16:07.981929 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-x5w94_255ea945-6e83-4ead-b609-b47a6b5eaafa/operator/0.log" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.066145 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsvp8\" (UniqueName: \"kubernetes.io/projected/c04aa62d-680a-43f2-9505-32ed4c7eef88-kube-api-access-tsvp8\") on node \"crc\" DevicePath \"\"" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.066375 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-ct7p5_675f6ffb-b144-4efc-b47a-81c748cb4765/observability-ui-dashboards/0.log" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.080159 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-d9577b4dd-zfrmv_15bde542-1ffd-48b4-b2cf-98d98348920e/perses-operator/0.log" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.311757 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" event={"ID":"c04aa62d-680a-43f2-9505-32ed4c7eef88","Type":"ContainerDied","Data":"3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd"} Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.311801 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3492a923b738ece964766fafeedc19f1f01a3cbe1db2ea9d0bad37145eaf5cfd" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.311901 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564236-jb4sr" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.359514 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-5qhqg"] Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.372255 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564230-5qhqg"] Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.981190 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:16:08 crc kubenswrapper[4792]: I0318 17:16:08.981713 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:16:09 crc kubenswrapper[4792]: I0318 17:16:09.871460 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd50d1f-7cd9-408e-884b-878dbef6ee28" path="/var/lib/kubelet/pods/acd50d1f-7cd9-408e-884b-878dbef6ee28/volumes" Mar 18 17:16:10 crc kubenswrapper[4792]: I0318 17:16:10.086691 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" probeResult="failure" output=< Mar 18 17:16:10 crc kubenswrapper[4792]: timeout: failed to connect 
service ":50051" within 1s Mar 18 17:16:10 crc kubenswrapper[4792]: > Mar 18 17:16:20 crc kubenswrapper[4792]: I0318 17:16:20.025622 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" probeResult="failure" output=< Mar 18 17:16:20 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:16:20 crc kubenswrapper[4792]: > Mar 18 17:16:20 crc kubenswrapper[4792]: I0318 17:16:20.976844 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/kube-rbac-proxy/0.log" Mar 18 17:16:21 crc kubenswrapper[4792]: I0318 17:16:21.006078 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/manager/1.log" Mar 18 17:16:21 crc kubenswrapper[4792]: I0318 17:16:21.050447 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c8cdd9f9f-lv5d4_2c438c99-c0c4-43ec-a5e7-33a18425e63f/manager/0.log" Mar 18 17:16:30 crc kubenswrapper[4792]: I0318 17:16:30.027942 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" probeResult="failure" output=< Mar 18 17:16:30 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:16:30 crc kubenswrapper[4792]: > Mar 18 17:16:40 crc kubenswrapper[4792]: I0318 17:16:40.039992 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" probeResult="failure" output=< Mar 18 17:16:40 crc kubenswrapper[4792]: 
timeout: failed to connect service ":50051" within 1s Mar 18 17:16:40 crc kubenswrapper[4792]: > Mar 18 17:16:50 crc kubenswrapper[4792]: I0318 17:16:50.035403 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" probeResult="failure" output=< Mar 18 17:16:50 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:16:50 crc kubenswrapper[4792]: > Mar 18 17:16:59 crc kubenswrapper[4792]: I0318 17:16:59.035935 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:16:59 crc kubenswrapper[4792]: I0318 17:16:59.092963 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:16:59 crc kubenswrapper[4792]: I0318 17:16:59.879951 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"] Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:00.932765 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qjzd4" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" containerID="cri-o://ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d" gracePeriod=2 Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.943435 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.945653 4792 generic.go:334] "Generic (PLEG): container finished" podID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerID="ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d" exitCode=0 Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.945698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerDied","Data":"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d"} Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.945731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjzd4" event={"ID":"ec609206-be1e-4c4a-a0ba-667965ae5064","Type":"ContainerDied","Data":"da325c17d0ee94bc63d738374062e93d50b0c59fa058289bda5fb7b4e5d12f97"} Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.945750 4792 scope.go:117] "RemoveContainer" containerID="ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d" Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.991124 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities\") pod \"ec609206-be1e-4c4a-a0ba-667965ae5064\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.991299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content\") pod \"ec609206-be1e-4c4a-a0ba-667965ae5064\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.991461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xvx9k\" (UniqueName: \"kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k\") pod \"ec609206-be1e-4c4a-a0ba-667965ae5064\" (UID: \"ec609206-be1e-4c4a-a0ba-667965ae5064\") " Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.992193 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities" (OuterVolumeSpecName: "utilities") pod "ec609206-be1e-4c4a-a0ba-667965ae5064" (UID: "ec609206-be1e-4c4a-a0ba-667965ae5064"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.993719 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:01 crc kubenswrapper[4792]: I0318 17:17:01.996598 4792 scope.go:117] "RemoveContainer" containerID="a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.006351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k" (OuterVolumeSpecName: "kube-api-access-xvx9k") pod "ec609206-be1e-4c4a-a0ba-667965ae5064" (UID: "ec609206-be1e-4c4a-a0ba-667965ae5064"). InnerVolumeSpecName "kube-api-access-xvx9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.087724 4792 scope.go:117] "RemoveContainer" containerID="440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.096920 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvx9k\" (UniqueName: \"kubernetes.io/projected/ec609206-be1e-4c4a-a0ba-667965ae5064-kube-api-access-xvx9k\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.162473 4792 scope.go:117] "RemoveContainer" containerID="ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d" Mar 18 17:17:02 crc kubenswrapper[4792]: E0318 17:17:02.164263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d\": container with ID starting with ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d not found: ID does not exist" containerID="ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.164298 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d"} err="failed to get container status \"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d\": rpc error: code = NotFound desc = could not find container \"ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d\": container with ID starting with ab4b39a1472c00d60f42e8830615ce3a28d2ab03c43c007df57bea98eb05e03d not found: ID does not exist" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.164324 4792 scope.go:117] "RemoveContainer" containerID="a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be" Mar 18 17:17:02 crc kubenswrapper[4792]: E0318 17:17:02.164795 
4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be\": container with ID starting with a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be not found: ID does not exist" containerID="a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.165232 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be"} err="failed to get container status \"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be\": rpc error: code = NotFound desc = could not find container \"a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be\": container with ID starting with a4cda987d32766fb50d91a31547f9a62562cb70f46ba7c0693c12b49c97784be not found: ID does not exist" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.165250 4792 scope.go:117] "RemoveContainer" containerID="440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74" Mar 18 17:17:02 crc kubenswrapper[4792]: E0318 17:17:02.165463 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74\": container with ID starting with 440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74 not found: ID does not exist" containerID="440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.165477 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74"} err="failed to get container status \"440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74\": rpc error: code = 
NotFound desc = could not find container \"440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74\": container with ID starting with 440be79f3796d0941d31f287d395fe4d485c69d7dd786ba5f8eda71e162abe74 not found: ID does not exist" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.184734 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec609206-be1e-4c4a-a0ba-667965ae5064" (UID: "ec609206-be1e-4c4a-a0ba-667965ae5064"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.199364 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec609206-be1e-4c4a-a0ba-667965ae5064-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:17:02 crc kubenswrapper[4792]: I0318 17:17:02.966115 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjzd4" Mar 18 17:17:03 crc kubenswrapper[4792]: I0318 17:17:03.015871 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"] Mar 18 17:17:03 crc kubenswrapper[4792]: I0318 17:17:03.046839 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qjzd4"] Mar 18 17:17:03 crc kubenswrapper[4792]: I0318 17:17:03.871627 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" path="/var/lib/kubelet/pods/ec609206-be1e-4c4a-a0ba-667965ae5064/volumes" Mar 18 17:17:04 crc kubenswrapper[4792]: I0318 17:17:04.564102 4792 scope.go:117] "RemoveContainer" containerID="cd12f958b06bc3c5fa631bd6214a7cd949d72a3561a55d25f668f1855d25fb21" Mar 18 17:17:30 crc kubenswrapper[4792]: I0318 17:17:30.322203 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:17:30 crc kubenswrapper[4792]: I0318 17:17:30.322809 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.162825 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564238-fvpxp"] Mar 18 17:18:00 crc kubenswrapper[4792]: E0318 17:18:00.163954 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="extract-utilities" Mar 18 17:18:00 crc 
kubenswrapper[4792]: I0318 17:18:00.163988 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="extract-utilities" Mar 18 17:18:00 crc kubenswrapper[4792]: E0318 17:18:00.164028 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04aa62d-680a-43f2-9505-32ed4c7eef88" containerName="oc" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.164034 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04aa62d-680a-43f2-9505-32ed4c7eef88" containerName="oc" Mar 18 17:18:00 crc kubenswrapper[4792]: E0318 17:18:00.164055 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="extract-content" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.164061 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="extract-content" Mar 18 17:18:00 crc kubenswrapper[4792]: E0318 17:18:00.164072 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.164078 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.164349 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04aa62d-680a-43f2-9505-32ed4c7eef88" containerName="oc" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.164371 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec609206-be1e-4c4a-a0ba-667965ae5064" containerName="registry-server" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.165309 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.167772 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.168287 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.171433 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.176416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-fvpxp"] Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.213325 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wfmd\" (UniqueName: \"kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd\") pod \"auto-csr-approver-29564238-fvpxp\" (UID: \"acbcbcda-17c8-448b-a8a0-272d2b92682e\") " pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.315956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wfmd\" (UniqueName: \"kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd\") pod \"auto-csr-approver-29564238-fvpxp\" (UID: \"acbcbcda-17c8-448b-a8a0-272d2b92682e\") " pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.321986 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 
17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.322055 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.352791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wfmd\" (UniqueName: \"kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd\") pod \"auto-csr-approver-29564238-fvpxp\" (UID: \"acbcbcda-17c8-448b-a8a0-272d2b92682e\") " pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:00 crc kubenswrapper[4792]: I0318 17:18:00.492809 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:01 crc kubenswrapper[4792]: I0318 17:18:01.936709 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-fvpxp"] Mar 18 17:18:02 crc kubenswrapper[4792]: I0318 17:18:02.682801 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" event={"ID":"acbcbcda-17c8-448b-a8a0-272d2b92682e","Type":"ContainerStarted","Data":"7b670d7cadc6b2d26b14335af5d2cfa76749e23359cb248e330917d2f61301b7"} Mar 18 17:18:04 crc kubenswrapper[4792]: I0318 17:18:04.704155 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" event={"ID":"acbcbcda-17c8-448b-a8a0-272d2b92682e","Type":"ContainerStarted","Data":"af7012d330dbd00f1b2b1e5892a6dce35ba81054d171d63379712bee687a9a9b"} Mar 18 17:18:04 crc kubenswrapper[4792]: I0318 17:18:04.731794 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29564238-fvpxp" podStartSLOduration=3.662061908 podStartE2EDuration="4.73177321s" podCreationTimestamp="2026-03-18 17:18:00 +0000 UTC" firstStartedPulling="2026-03-18 17:18:01.950547803 +0000 UTC m=+6230.819876740" lastFinishedPulling="2026-03-18 17:18:03.020259105 +0000 UTC m=+6231.889588042" observedRunningTime="2026-03-18 17:18:04.719029098 +0000 UTC m=+6233.588358035" watchObservedRunningTime="2026-03-18 17:18:04.73177321 +0000 UTC m=+6233.601102147" Mar 18 17:18:04 crc kubenswrapper[4792]: I0318 17:18:04.751958 4792 scope.go:117] "RemoveContainer" containerID="aa150696d06100318360a05e099b07d133b4e3ffac9b066bda3f651d10330dbb" Mar 18 17:18:05 crc kubenswrapper[4792]: I0318 17:18:05.732132 4792 generic.go:334] "Generic (PLEG): container finished" podID="acbcbcda-17c8-448b-a8a0-272d2b92682e" containerID="af7012d330dbd00f1b2b1e5892a6dce35ba81054d171d63379712bee687a9a9b" exitCode=0 Mar 18 17:18:05 crc kubenswrapper[4792]: I0318 17:18:05.732376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" event={"ID":"acbcbcda-17c8-448b-a8a0-272d2b92682e","Type":"ContainerDied","Data":"af7012d330dbd00f1b2b1e5892a6dce35ba81054d171d63379712bee687a9a9b"} Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.155559 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.332835 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wfmd\" (UniqueName: \"kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd\") pod \"acbcbcda-17c8-448b-a8a0-272d2b92682e\" (UID: \"acbcbcda-17c8-448b-a8a0-272d2b92682e\") " Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.351332 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd" (OuterVolumeSpecName: "kube-api-access-8wfmd") pod "acbcbcda-17c8-448b-a8a0-272d2b92682e" (UID: "acbcbcda-17c8-448b-a8a0-272d2b92682e"). InnerVolumeSpecName "kube-api-access-8wfmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.435694 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wfmd\" (UniqueName: \"kubernetes.io/projected/acbcbcda-17c8-448b-a8a0-272d2b92682e-kube-api-access-8wfmd\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.757711 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" event={"ID":"acbcbcda-17c8-448b-a8a0-272d2b92682e","Type":"ContainerDied","Data":"7b670d7cadc6b2d26b14335af5d2cfa76749e23359cb248e330917d2f61301b7"} Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.757760 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b670d7cadc6b2d26b14335af5d2cfa76749e23359cb248e330917d2f61301b7" Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.757781 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564238-fvpxp" Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.801468 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-4nhvl"] Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.813615 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564232-4nhvl"] Mar 18 17:18:07 crc kubenswrapper[4792]: I0318 17:18:07.866896 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913dd3c9-fc8b-499e-846e-4c9380c3df75" path="/var/lib/kubelet/pods/913dd3c9-fc8b-499e-846e-4c9380c3df75/volumes" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.217723 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:22 crc kubenswrapper[4792]: E0318 17:18:22.218800 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbcbcda-17c8-448b-a8a0-272d2b92682e" containerName="oc" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.218816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbcbcda-17c8-448b-a8a0-272d2b92682e" containerName="oc" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.219129 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbcbcda-17c8-448b-a8a0-272d2b92682e" containerName="oc" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.221133 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.254235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.372939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.373017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd78\" (UniqueName: \"kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.373051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.475466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.475544 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wcd78\" (UniqueName: \"kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.475584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.475928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.476008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.498608 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcd78\" (UniqueName: \"kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78\") pod \"community-operators-8hgnj\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:22 crc kubenswrapper[4792]: I0318 17:18:22.549961 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:23 crc kubenswrapper[4792]: I0318 17:18:23.048559 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:23 crc kubenswrapper[4792]: I0318 17:18:23.967733 4792 generic.go:334] "Generic (PLEG): container finished" podID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerID="d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252" exitCode=0 Mar 18 17:18:23 crc kubenswrapper[4792]: I0318 17:18:23.969813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerDied","Data":"d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252"} Mar 18 17:18:23 crc kubenswrapper[4792]: I0318 17:18:23.969924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerStarted","Data":"8e3297973c0e83329a37746e181209d465de2d74a7908430f50cfe0ca5d64e34"} Mar 18 17:18:24 crc kubenswrapper[4792]: I0318 17:18:24.980554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerStarted","Data":"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b"} Mar 18 17:18:27 crc kubenswrapper[4792]: I0318 17:18:27.002251 4792 generic.go:334] "Generic (PLEG): container finished" podID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerID="b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b" exitCode=0 Mar 18 17:18:27 crc kubenswrapper[4792]: I0318 17:18:27.002341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" 
event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerDied","Data":"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b"} Mar 18 17:18:28 crc kubenswrapper[4792]: I0318 17:18:28.016989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerStarted","Data":"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5"} Mar 18 17:18:28 crc kubenswrapper[4792]: I0318 17:18:28.048854 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hgnj" podStartSLOduration=2.321440577 podStartE2EDuration="6.048832751s" podCreationTimestamp="2026-03-18 17:18:22 +0000 UTC" firstStartedPulling="2026-03-18 17:18:23.973768577 +0000 UTC m=+6252.843097514" lastFinishedPulling="2026-03-18 17:18:27.701160751 +0000 UTC m=+6256.570489688" observedRunningTime="2026-03-18 17:18:28.041754459 +0000 UTC m=+6256.911083416" watchObservedRunningTime="2026-03-18 17:18:28.048832751 +0000 UTC m=+6256.918161688" Mar 18 17:18:30 crc kubenswrapper[4792]: I0318 17:18:30.322156 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:18:30 crc kubenswrapper[4792]: I0318 17:18:30.322774 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:18:30 crc kubenswrapper[4792]: I0318 17:18:30.322834 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 17:18:30 crc kubenswrapper[4792]: I0318 17:18:30.325461 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:18:30 crc kubenswrapper[4792]: I0318 17:18:30.325531 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a" gracePeriod=600 Mar 18 17:18:31 crc kubenswrapper[4792]: I0318 17:18:31.053244 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a" exitCode=0 Mar 18 17:18:31 crc kubenswrapper[4792]: I0318 17:18:31.054045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a"} Mar 18 17:18:31 crc kubenswrapper[4792]: I0318 17:18:31.054101 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerStarted","Data":"cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585"} Mar 18 17:18:31 crc kubenswrapper[4792]: I0318 17:18:31.054130 4792 scope.go:117] "RemoveContainer" 
containerID="4ca223e7fd77b004e1d99d3a3d736ae45f62d1df62252c1659418907963fbef8" Mar 18 17:18:32 crc kubenswrapper[4792]: I0318 17:18:32.551146 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:32 crc kubenswrapper[4792]: I0318 17:18:32.551752 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:32 crc kubenswrapper[4792]: I0318 17:18:32.612045 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:33 crc kubenswrapper[4792]: I0318 17:18:33.131723 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:33 crc kubenswrapper[4792]: I0318 17:18:33.178818 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.104047 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hgnj" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="registry-server" containerID="cri-o://0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5" gracePeriod=2 Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.681668 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.855432 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcd78\" (UniqueName: \"kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78\") pod \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.855782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content\") pod \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.855909 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities\") pod \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\" (UID: \"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87\") " Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.856607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities" (OuterVolumeSpecName: "utilities") pod "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" (UID: "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.863255 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78" (OuterVolumeSpecName: "kube-api-access-wcd78") pod "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" (UID: "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87"). InnerVolumeSpecName "kube-api-access-wcd78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.910045 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" (UID: "0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.959481 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcd78\" (UniqueName: \"kubernetes.io/projected/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-kube-api-access-wcd78\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.959517 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:35 crc kubenswrapper[4792]: I0318 17:18:35.959528 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.117256 4792 generic.go:334] "Generic (PLEG): container finished" podID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerID="0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5" exitCode=0 Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.117314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerDied","Data":"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5"} Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.117355 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-8hgnj" event={"ID":"0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87","Type":"ContainerDied","Data":"8e3297973c0e83329a37746e181209d465de2d74a7908430f50cfe0ca5d64e34"} Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.117382 4792 scope.go:117] "RemoveContainer" containerID="0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.119303 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hgnj" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.145850 4792 scope.go:117] "RemoveContainer" containerID="b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.172316 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.179243 4792 scope.go:117] "RemoveContainer" containerID="d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.198839 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hgnj"] Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.285151 4792 scope.go:117] "RemoveContainer" containerID="0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5" Mar 18 17:18:36 crc kubenswrapper[4792]: E0318 17:18:36.286151 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5\": container with ID starting with 0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5 not found: ID does not exist" containerID="0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 
17:18:36.286198 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5"} err="failed to get container status \"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5\": rpc error: code = NotFound desc = could not find container \"0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5\": container with ID starting with 0b8e798f7813d9829920b4a12805e795eb9a0f14e343a9ee4a51ab2acfdb3ab5 not found: ID does not exist" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.286227 4792 scope.go:117] "RemoveContainer" containerID="b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b" Mar 18 17:18:36 crc kubenswrapper[4792]: E0318 17:18:36.289353 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b\": container with ID starting with b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b not found: ID does not exist" containerID="b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.289385 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b"} err="failed to get container status \"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b\": rpc error: code = NotFound desc = could not find container \"b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b\": container with ID starting with b5b302b4d8d8b59756c726cd6bb9df74b4a57490aa962c874f9dcfe06e18e56b not found: ID does not exist" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.289411 4792 scope.go:117] "RemoveContainer" containerID="d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252" Mar 18 17:18:36 crc 
kubenswrapper[4792]: E0318 17:18:36.301269 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252\": container with ID starting with d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252 not found: ID does not exist" containerID="d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252" Mar 18 17:18:36 crc kubenswrapper[4792]: I0318 17:18:36.301331 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252"} err="failed to get container status \"d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252\": rpc error: code = NotFound desc = could not find container \"d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252\": container with ID starting with d5139af4d7ff370c85f550499b59437e7d3a6bb9fd0d0b60fa3e7be862aa1252 not found: ID does not exist" Mar 18 17:18:37 crc kubenswrapper[4792]: I0318 17:18:37.868680 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" path="/var/lib/kubelet/pods/0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87/volumes" Mar 18 17:18:40 crc kubenswrapper[4792]: I0318 17:18:40.166891 4792 generic.go:334] "Generic (PLEG): container finished" podID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerID="747e41d0837bba9406f2e4d6962d91726b330295636a21c4a6aad3dfc15f2ec6" exitCode=0 Mar 18 17:18:40 crc kubenswrapper[4792]: I0318 17:18:40.166985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jgh96/must-gather-c52ck" event={"ID":"c30591cd-5659-4403-b080-ace5f1d6d48f","Type":"ContainerDied","Data":"747e41d0837bba9406f2e4d6962d91726b330295636a21c4a6aad3dfc15f2ec6"} Mar 18 17:18:40 crc kubenswrapper[4792]: I0318 17:18:40.169259 4792 scope.go:117] "RemoveContainer" 
containerID="747e41d0837bba9406f2e4d6962d91726b330295636a21c4a6aad3dfc15f2ec6" Mar 18 17:18:41 crc kubenswrapper[4792]: I0318 17:18:41.079462 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgh96_must-gather-c52ck_c30591cd-5659-4403-b080-ace5f1d6d48f/gather/0.log" Mar 18 17:18:49 crc kubenswrapper[4792]: I0318 17:18:49.830793 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jgh96/must-gather-c52ck"] Mar 18 17:18:49 crc kubenswrapper[4792]: I0318 17:18:49.832913 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jgh96/must-gather-c52ck" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="copy" containerID="cri-o://6b5b187f7dbd99c72c88f7946b52817a171be0bafa419a17efa41f0cfabbbcac" gracePeriod=2 Mar 18 17:18:49 crc kubenswrapper[4792]: I0318 17:18:49.846506 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jgh96/must-gather-c52ck"] Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.278808 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgh96_must-gather-c52ck_c30591cd-5659-4403-b080-ace5f1d6d48f/copy/0.log" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.279743 4792 generic.go:334] "Generic (PLEG): container finished" podID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerID="6b5b187f7dbd99c72c88f7946b52817a171be0bafa419a17efa41f0cfabbbcac" exitCode=143 Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.279821 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b650037fc6b4cbe86559d966d683e1471ff6ce2f09eda15b0947936b0436cd" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.349989 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jgh96_must-gather-c52ck_c30591cd-5659-4403-b080-ace5f1d6d48f/copy/0.log" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.350593 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.368795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q8lk\" (UniqueName: \"kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk\") pod \"c30591cd-5659-4403-b080-ace5f1d6d48f\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.368924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output\") pod \"c30591cd-5659-4403-b080-ace5f1d6d48f\" (UID: \"c30591cd-5659-4403-b080-ace5f1d6d48f\") " Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.377085 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk" (OuterVolumeSpecName: "kube-api-access-8q8lk") pod "c30591cd-5659-4403-b080-ace5f1d6d48f" (UID: "c30591cd-5659-4403-b080-ace5f1d6d48f"). InnerVolumeSpecName "kube-api-access-8q8lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.472380 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q8lk\" (UniqueName: \"kubernetes.io/projected/c30591cd-5659-4403-b080-ace5f1d6d48f-kube-api-access-8q8lk\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.567267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c30591cd-5659-4403-b080-ace5f1d6d48f" (UID: "c30591cd-5659-4403-b080-ace5f1d6d48f"). 
InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:18:50 crc kubenswrapper[4792]: I0318 17:18:50.575491 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c30591cd-5659-4403-b080-ace5f1d6d48f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 17:18:51 crc kubenswrapper[4792]: I0318 17:18:51.289734 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jgh96/must-gather-c52ck" Mar 18 17:18:51 crc kubenswrapper[4792]: I0318 17:18:51.867025 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" path="/var/lib/kubelet/pods/c30591cd-5659-4403-b080-ace5f1d6d48f/volumes" Mar 18 17:19:04 crc kubenswrapper[4792]: I0318 17:19:04.823720 4792 scope.go:117] "RemoveContainer" containerID="bcabf6e9ea4d56a82cc0a34fd9b5fdbd2ccc8e832a44559a0915925d47aabdda" Mar 18 17:19:04 crc kubenswrapper[4792]: I0318 17:19:04.859951 4792 scope.go:117] "RemoveContainer" containerID="9dc6affc505286abc993646078ab80eec3c2e84f6160e493670e61f59e9b71a6" Mar 18 17:19:04 crc kubenswrapper[4792]: I0318 17:19:04.946720 4792 scope.go:117] "RemoveContainer" containerID="747e41d0837bba9406f2e4d6962d91726b330295636a21c4a6aad3dfc15f2ec6" Mar 18 17:19:05 crc kubenswrapper[4792]: I0318 17:19:05.004467 4792 scope.go:117] "RemoveContainer" containerID="6b5b187f7dbd99c72c88f7946b52817a171be0bafa419a17efa41f0cfabbbcac" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.777851 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:17 crc kubenswrapper[4792]: E0318 17:19:17.779117 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="registry-server" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779136 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="registry-server" Mar 18 17:19:17 crc kubenswrapper[4792]: E0318 17:19:17.779161 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="extract-content" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779169 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="extract-content" Mar 18 17:19:17 crc kubenswrapper[4792]: E0318 17:19:17.779179 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="extract-utilities" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779188 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="extract-utilities" Mar 18 17:19:17 crc kubenswrapper[4792]: E0318 17:19:17.779237 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="gather" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779246 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="gather" Mar 18 17:19:17 crc kubenswrapper[4792]: E0318 17:19:17.779263 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="copy" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779271 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="copy" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779563 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="gather" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779600 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c30591cd-5659-4403-b080-ace5f1d6d48f" containerName="copy" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.779627 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3b2653-bfcd-47d7-9e0c-0cfe8273bb87" containerName="registry-server" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.782040 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.810887 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.973651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.974017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9jt4\" (UniqueName: \"kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:17 crc kubenswrapper[4792]: I0318 17:19:17.974063 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.077934 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.078009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9jt4\" (UniqueName: \"kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.078047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.078787 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.078928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.098669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b9jt4\" (UniqueName: \"kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4\") pod \"certified-operators-87h8s\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.106686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.589230 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:18 crc kubenswrapper[4792]: I0318 17:19:18.613686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerStarted","Data":"51a01be2c26f7c140a915153554a6dc5213ab8be9009c6f5b583c0588c8af4ec"} Mar 18 17:19:19 crc kubenswrapper[4792]: I0318 17:19:19.628547 4792 generic.go:334] "Generic (PLEG): container finished" podID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerID="d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9" exitCode=0 Mar 18 17:19:19 crc kubenswrapper[4792]: I0318 17:19:19.628590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerDied","Data":"d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9"} Mar 18 17:19:20 crc kubenswrapper[4792]: I0318 17:19:20.646945 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerStarted","Data":"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c"} Mar 18 17:19:22 crc kubenswrapper[4792]: I0318 17:19:22.676997 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerID="2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c" exitCode=0 Mar 18 17:19:22 crc kubenswrapper[4792]: I0318 17:19:22.677096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerDied","Data":"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c"} Mar 18 17:19:23 crc kubenswrapper[4792]: I0318 17:19:23.695548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerStarted","Data":"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a"} Mar 18 17:19:23 crc kubenswrapper[4792]: I0318 17:19:23.731449 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-87h8s" podStartSLOduration=2.9767871120000002 podStartE2EDuration="6.731430286s" podCreationTimestamp="2026-03-18 17:19:17 +0000 UTC" firstStartedPulling="2026-03-18 17:19:19.631036503 +0000 UTC m=+6308.500365440" lastFinishedPulling="2026-03-18 17:19:23.385679667 +0000 UTC m=+6312.255008614" observedRunningTime="2026-03-18 17:19:23.714343608 +0000 UTC m=+6312.583672555" watchObservedRunningTime="2026-03-18 17:19:23.731430286 +0000 UTC m=+6312.600759223" Mar 18 17:19:28 crc kubenswrapper[4792]: I0318 17:19:28.108009 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:28 crc kubenswrapper[4792]: I0318 17:19:28.108413 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:29 crc kubenswrapper[4792]: I0318 17:19:29.165246 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-87h8s" 
podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="registry-server" probeResult="failure" output=< Mar 18 17:19:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 18 17:19:29 crc kubenswrapper[4792]: > Mar 18 17:19:38 crc kubenswrapper[4792]: I0318 17:19:38.157454 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:38 crc kubenswrapper[4792]: I0318 17:19:38.214593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:38 crc kubenswrapper[4792]: I0318 17:19:38.397450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:39 crc kubenswrapper[4792]: I0318 17:19:39.901675 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-87h8s" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="registry-server" containerID="cri-o://2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a" gracePeriod=2 Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.531557 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.659285 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content\") pod \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.659374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities\") pod \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.659497 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9jt4\" (UniqueName: \"kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4\") pod \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\" (UID: \"2a2ea67a-73cc-4de5-9cda-01961ac38fa8\") " Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.664995 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities" (OuterVolumeSpecName: "utilities") pod "2a2ea67a-73cc-4de5-9cda-01961ac38fa8" (UID: "2a2ea67a-73cc-4de5-9cda-01961ac38fa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.710342 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4" (OuterVolumeSpecName: "kube-api-access-b9jt4") pod "2a2ea67a-73cc-4de5-9cda-01961ac38fa8" (UID: "2a2ea67a-73cc-4de5-9cda-01961ac38fa8"). InnerVolumeSpecName "kube-api-access-b9jt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.768240 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9jt4\" (UniqueName: \"kubernetes.io/projected/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-kube-api-access-b9jt4\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.768278 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.773534 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a2ea67a-73cc-4de5-9cda-01961ac38fa8" (UID: "2a2ea67a-73cc-4de5-9cda-01961ac38fa8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.870734 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2ea67a-73cc-4de5-9cda-01961ac38fa8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.914180 4792 generic.go:334] "Generic (PLEG): container finished" podID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerID="2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a" exitCode=0 Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.914238 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-87h8s" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.914243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerDied","Data":"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a"} Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.914303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87h8s" event={"ID":"2a2ea67a-73cc-4de5-9cda-01961ac38fa8","Type":"ContainerDied","Data":"51a01be2c26f7c140a915153554a6dc5213ab8be9009c6f5b583c0588c8af4ec"} Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.914335 4792 scope.go:117] "RemoveContainer" containerID="2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.939661 4792 scope.go:117] "RemoveContainer" containerID="2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c" Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.960232 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.975920 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-87h8s"] Mar 18 17:19:40 crc kubenswrapper[4792]: I0318 17:19:40.982316 4792 scope.go:117] "RemoveContainer" containerID="d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.030012 4792 scope.go:117] "RemoveContainer" containerID="2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a" Mar 18 17:19:41 crc kubenswrapper[4792]: E0318 17:19:41.030729 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a\": container with ID starting with 2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a not found: ID does not exist" containerID="2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.030786 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a"} err="failed to get container status \"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a\": rpc error: code = NotFound desc = could not find container \"2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a\": container with ID starting with 2d7901c81b2754e651a3e1e2ab355da4580bd9585ff2d68cae97dfb913ad2e2a not found: ID does not exist" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.030827 4792 scope.go:117] "RemoveContainer" containerID="2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c" Mar 18 17:19:41 crc kubenswrapper[4792]: E0318 17:19:41.031356 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c\": container with ID starting with 2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c not found: ID does not exist" containerID="2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.031392 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c"} err="failed to get container status \"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c\": rpc error: code = NotFound desc = could not find container \"2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c\": container with ID 
starting with 2f4874edf1f2ce47a7a182cb24835c39ba4f4adad282989edb8408485b69249c not found: ID does not exist" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.031415 4792 scope.go:117] "RemoveContainer" containerID="d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9" Mar 18 17:19:41 crc kubenswrapper[4792]: E0318 17:19:41.031854 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9\": container with ID starting with d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9 not found: ID does not exist" containerID="d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.031876 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9"} err="failed to get container status \"d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9\": rpc error: code = NotFound desc = could not find container \"d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9\": container with ID starting with d67b6c7484465231b7044edfdf39b07311e349091f36763b43e7f338eb01f3c9 not found: ID does not exist" Mar 18 17:19:41 crc kubenswrapper[4792]: I0318 17:19:41.867453 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" path="/var/lib/kubelet/pods/2a2ea67a-73cc-4de5-9cda-01961ac38fa8/volumes" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.157739 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564240-825z7"] Mar 18 17:20:00 crc kubenswrapper[4792]: E0318 17:20:00.158909 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="registry-server" Mar 18 17:20:00 crc 
kubenswrapper[4792]: I0318 17:20:00.158925 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="registry-server" Mar 18 17:20:00 crc kubenswrapper[4792]: E0318 17:20:00.158952 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="extract-utilities" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.158960 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="extract-utilities" Mar 18 17:20:00 crc kubenswrapper[4792]: E0318 17:20:00.159007 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="extract-content" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.159018 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="extract-content" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.159286 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a2ea67a-73cc-4de5-9cda-01961ac38fa8" containerName="registry-server" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.160157 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.162178 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.162690 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.163311 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.172547 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-825z7"] Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.257641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k87n\" (UniqueName: \"kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n\") pod \"auto-csr-approver-29564240-825z7\" (UID: \"51c1b08f-b0d7-4f6e-b9db-07ce63e86300\") " pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.359996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k87n\" (UniqueName: \"kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n\") pod \"auto-csr-approver-29564240-825z7\" (UID: \"51c1b08f-b0d7-4f6e-b9db-07ce63e86300\") " pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.383678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k87n\" (UniqueName: \"kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n\") pod \"auto-csr-approver-29564240-825z7\" (UID: \"51c1b08f-b0d7-4f6e-b9db-07ce63e86300\") " 
pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.485604 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:00 crc kubenswrapper[4792]: I0318 17:20:00.936296 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564240-825z7"] Mar 18 17:20:01 crc kubenswrapper[4792]: I0318 17:20:01.328448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-825z7" event={"ID":"51c1b08f-b0d7-4f6e-b9db-07ce63e86300","Type":"ContainerStarted","Data":"583f6a09475755ec43b7dcbf4d9f87cc1f01322d9a37833d5bae96a77fa80a0d"} Mar 18 17:20:03 crc kubenswrapper[4792]: I0318 17:20:03.355848 4792 generic.go:334] "Generic (PLEG): container finished" podID="51c1b08f-b0d7-4f6e-b9db-07ce63e86300" containerID="c3382d88a1ee06cd22201a47a75943134eafe4126bba6bdac47e6529f2b2a0fd" exitCode=0 Mar 18 17:20:03 crc kubenswrapper[4792]: I0318 17:20:03.355929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-825z7" event={"ID":"51c1b08f-b0d7-4f6e-b9db-07ce63e86300","Type":"ContainerDied","Data":"c3382d88a1ee06cd22201a47a75943134eafe4126bba6bdac47e6529f2b2a0fd"} Mar 18 17:20:04 crc kubenswrapper[4792]: I0318 17:20:04.789496 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:04 crc kubenswrapper[4792]: I0318 17:20:04.893564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k87n\" (UniqueName: \"kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n\") pod \"51c1b08f-b0d7-4f6e-b9db-07ce63e86300\" (UID: \"51c1b08f-b0d7-4f6e-b9db-07ce63e86300\") " Mar 18 17:20:04 crc kubenswrapper[4792]: I0318 17:20:04.901686 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n" (OuterVolumeSpecName: "kube-api-access-4k87n") pod "51c1b08f-b0d7-4f6e-b9db-07ce63e86300" (UID: "51c1b08f-b0d7-4f6e-b9db-07ce63e86300"). InnerVolumeSpecName "kube-api-access-4k87n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:20:04 crc kubenswrapper[4792]: I0318 17:20:04.997015 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k87n\" (UniqueName: \"kubernetes.io/projected/51c1b08f-b0d7-4f6e-b9db-07ce63e86300-kube-api-access-4k87n\") on node \"crc\" DevicePath \"\"" Mar 18 17:20:05 crc kubenswrapper[4792]: I0318 17:20:05.393900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564240-825z7" event={"ID":"51c1b08f-b0d7-4f6e-b9db-07ce63e86300","Type":"ContainerDied","Data":"583f6a09475755ec43b7dcbf4d9f87cc1f01322d9a37833d5bae96a77fa80a0d"} Mar 18 17:20:05 crc kubenswrapper[4792]: I0318 17:20:05.393951 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="583f6a09475755ec43b7dcbf4d9f87cc1f01322d9a37833d5bae96a77fa80a0d" Mar 18 17:20:05 crc kubenswrapper[4792]: I0318 17:20:05.394066 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564240-825z7" Mar 18 17:20:05 crc kubenswrapper[4792]: I0318 17:20:05.873318 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-d7r6l"] Mar 18 17:20:05 crc kubenswrapper[4792]: I0318 17:20:05.884236 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564234-d7r6l"] Mar 18 17:20:07 crc kubenswrapper[4792]: I0318 17:20:07.871572 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d" path="/var/lib/kubelet/pods/65f5e3f7-9cb0-45f6-a8de-2d0245d14d2d/volumes" Mar 18 17:20:30 crc kubenswrapper[4792]: I0318 17:20:30.322383 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:20:30 crc kubenswrapper[4792]: I0318 17:20:30.322826 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:21:00 crc kubenswrapper[4792]: I0318 17:21:00.322423 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:21:00 crc kubenswrapper[4792]: I0318 17:21:00.323061 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:21:05 crc kubenswrapper[4792]: I0318 17:21:05.155533 4792 scope.go:117] "RemoveContainer" containerID="0645656482ab0059fa311ddb6336d8a466f7ac1e1b3d4bf6b1d7225a76529ecf" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.321553 4792 patch_prober.go:28] interesting pod/machine-config-daemon-2wtm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.322175 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.322232 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.323458 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585"} pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.323669 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerName="machine-config-daemon" containerID="cri-o://cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" gracePeriod=600 Mar 18 17:21:30 crc kubenswrapper[4792]: E0318 17:21:30.557514 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:21:30 crc kubenswrapper[4792]: E0318 17:21:30.571467 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51cef14_7d91_4e08_8045_831f7a9a65f8.slice/crio-conmon-cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51cef14_7d91_4e08_8045_831f7a9a65f8.slice/crio-cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.608348 4792 generic.go:334] "Generic (PLEG): container finished" podID="e51cef14-7d91-4e08-8045-831f7a9a65f8" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" exitCode=0 Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.608412 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" event={"ID":"e51cef14-7d91-4e08-8045-831f7a9a65f8","Type":"ContainerDied","Data":"cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585"} Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.608468 4792 
scope.go:117] "RemoveContainer" containerID="74500b43621b4500ad54de67c8edb80af58c911e029926923b3f0201ec20df9a" Mar 18 17:21:30 crc kubenswrapper[4792]: I0318 17:21:30.630506 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:21:30 crc kubenswrapper[4792]: E0318 17:21:30.631439 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:21:41 crc kubenswrapper[4792]: I0318 17:21:41.862869 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:21:41 crc kubenswrapper[4792]: E0318 17:21:41.863751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:21:53 crc kubenswrapper[4792]: I0318 17:21:53.855221 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:21:53 crc kubenswrapper[4792]: E0318 17:21:53.856956 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.162460 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564242-5h2hw"] Mar 18 17:22:00 crc kubenswrapper[4792]: E0318 17:22:00.164505 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c1b08f-b0d7-4f6e-b9db-07ce63e86300" containerName="oc" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.164584 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c1b08f-b0d7-4f6e-b9db-07ce63e86300" containerName="oc" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.164928 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c1b08f-b0d7-4f6e-b9db-07ce63e86300" containerName="oc" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.165884 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.168776 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.169251 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.169300 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.176078 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-5h2hw"] Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.355443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qnc5\" (UniqueName: \"kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5\") pod \"auto-csr-approver-29564242-5h2hw\" (UID: \"94fe30d3-a6bc-4e0b-a55e-893e6788c633\") " pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.459006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnc5\" (UniqueName: \"kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5\") pod \"auto-csr-approver-29564242-5h2hw\" (UID: \"94fe30d3-a6bc-4e0b-a55e-893e6788c633\") " pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.486899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qnc5\" (UniqueName: \"kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5\") pod \"auto-csr-approver-29564242-5h2hw\" (UID: \"94fe30d3-a6bc-4e0b-a55e-893e6788c633\") " 
pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.490008 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.960195 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564242-5h2hw"] Mar 18 17:22:00 crc kubenswrapper[4792]: I0318 17:22:00.963288 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:22:01 crc kubenswrapper[4792]: E0318 17:22:01.028785 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 18 17:22:01 crc kubenswrapper[4792]: I0318 17:22:01.952300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" event={"ID":"94fe30d3-a6bc-4e0b-a55e-893e6788c633","Type":"ContainerStarted","Data":"1d6d9731028ff488991ab21b403c176a277b065b05d2de70caaa2d5fc445ac92"} Mar 18 17:22:02 crc kubenswrapper[4792]: I0318 17:22:02.968730 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" event={"ID":"94fe30d3-a6bc-4e0b-a55e-893e6788c633","Type":"ContainerStarted","Data":"93a53748090450d1812012a004239031b2455cdf966a994e3d320cd92c13cddb"} Mar 18 17:22:02 crc kubenswrapper[4792]: I0318 17:22:02.999095 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" podStartSLOduration=1.768962165 podStartE2EDuration="2.999074423s" podCreationTimestamp="2026-03-18 17:22:00 +0000 UTC" firstStartedPulling="2026-03-18 17:22:00.963028259 +0000 UTC m=+6469.832357196" lastFinishedPulling="2026-03-18 17:22:02.193140507 +0000 UTC m=+6471.062469454" 
observedRunningTime="2026-03-18 17:22:02.985462244 +0000 UTC m=+6471.854791181" watchObservedRunningTime="2026-03-18 17:22:02.999074423 +0000 UTC m=+6471.868403360" Mar 18 17:22:03 crc kubenswrapper[4792]: I0318 17:22:03.980116 4792 generic.go:334] "Generic (PLEG): container finished" podID="94fe30d3-a6bc-4e0b-a55e-893e6788c633" containerID="93a53748090450d1812012a004239031b2455cdf966a994e3d320cd92c13cddb" exitCode=0 Mar 18 17:22:03 crc kubenswrapper[4792]: I0318 17:22:03.980229 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" event={"ID":"94fe30d3-a6bc-4e0b-a55e-893e6788c633","Type":"ContainerDied","Data":"93a53748090450d1812012a004239031b2455cdf966a994e3d320cd92c13cddb"} Mar 18 17:22:05 crc kubenswrapper[4792]: I0318 17:22:05.436450 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:05 crc kubenswrapper[4792]: I0318 17:22:05.602438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qnc5\" (UniqueName: \"kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5\") pod \"94fe30d3-a6bc-4e0b-a55e-893e6788c633\" (UID: \"94fe30d3-a6bc-4e0b-a55e-893e6788c633\") " Mar 18 17:22:05 crc kubenswrapper[4792]: I0318 17:22:05.609527 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5" (OuterVolumeSpecName: "kube-api-access-9qnc5") pod "94fe30d3-a6bc-4e0b-a55e-893e6788c633" (UID: "94fe30d3-a6bc-4e0b-a55e-893e6788c633"). InnerVolumeSpecName "kube-api-access-9qnc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:22:05 crc kubenswrapper[4792]: I0318 17:22:05.706669 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qnc5\" (UniqueName: \"kubernetes.io/projected/94fe30d3-a6bc-4e0b-a55e-893e6788c633-kube-api-access-9qnc5\") on node \"crc\" DevicePath \"\"" Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.013421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" event={"ID":"94fe30d3-a6bc-4e0b-a55e-893e6788c633","Type":"ContainerDied","Data":"1d6d9731028ff488991ab21b403c176a277b065b05d2de70caaa2d5fc445ac92"} Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.013681 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6d9731028ff488991ab21b403c176a277b065b05d2de70caaa2d5fc445ac92" Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.013491 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564242-5h2hw" Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.046505 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-jb4sr"] Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.057229 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564236-jb4sr"] Mar 18 17:22:06 crc kubenswrapper[4792]: I0318 17:22:06.854527 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:22:06 crc kubenswrapper[4792]: E0318 17:22:06.854878 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:22:07 crc kubenswrapper[4792]: I0318 17:22:07.870639 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04aa62d-680a-43f2-9505-32ed4c7eef88" path="/var/lib/kubelet/pods/c04aa62d-680a-43f2-9505-32ed4c7eef88/volumes" Mar 18 17:22:19 crc kubenswrapper[4792]: I0318 17:22:19.855211 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:22:19 crc kubenswrapper[4792]: E0318 17:22:19.855958 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:22:31 crc kubenswrapper[4792]: I0318 17:22:31.862294 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:22:31 crc kubenswrapper[4792]: E0318 17:22:31.863044 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:22:44 crc kubenswrapper[4792]: I0318 17:22:44.854414 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:22:44 crc kubenswrapper[4792]: E0318 17:22:44.855483 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:22:59 crc kubenswrapper[4792]: I0318 17:22:59.854236 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:22:59 crc kubenswrapper[4792]: E0318 17:22:59.854985 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:23:05 crc kubenswrapper[4792]: I0318 17:23:05.254147 4792 scope.go:117] "RemoveContainer" containerID="ae647c07870e7b3e0ae7e021d0a11b8e20ed4c75326a4ad6703f31dca415148d" Mar 18 17:23:13 crc kubenswrapper[4792]: I0318 17:23:13.855411 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:23:13 crc kubenswrapper[4792]: E0318 17:23:13.857770 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:23:24 crc kubenswrapper[4792]: I0318 17:23:24.858948 4792 scope.go:117] 
"RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:23:24 crc kubenswrapper[4792]: E0318 17:23:24.860342 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:23:36 crc kubenswrapper[4792]: I0318 17:23:36.855443 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:23:36 crc kubenswrapper[4792]: E0318 17:23:36.856311 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:23:50 crc kubenswrapper[4792]: I0318 17:23:50.854738 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:23:50 crc kubenswrapper[4792]: E0318 17:23:50.856440 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.157638 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564244-cgrj4"] Mar 18 17:24:00 crc kubenswrapper[4792]: E0318 17:24:00.158777 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94fe30d3-a6bc-4e0b-a55e-893e6788c633" containerName="oc" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.158794 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="94fe30d3-a6bc-4e0b-a55e-893e6788c633" containerName="oc" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.159163 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="94fe30d3-a6bc-4e0b-a55e-893e6788c633" containerName="oc" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.160249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.163385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.163854 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.166372 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-4j967" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.187783 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-cgrj4"] Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.286723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wpd8\" (UniqueName: \"kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8\") pod \"auto-csr-approver-29564244-cgrj4\" (UID: \"30042883-da05-4930-88b1-f3ecd459b354\") " 
pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.389113 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wpd8\" (UniqueName: \"kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8\") pod \"auto-csr-approver-29564244-cgrj4\" (UID: \"30042883-da05-4930-88b1-f3ecd459b354\") " pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.409785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wpd8\" (UniqueName: \"kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8\") pod \"auto-csr-approver-29564244-cgrj4\" (UID: \"30042883-da05-4930-88b1-f3ecd459b354\") " pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.492881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:00 crc kubenswrapper[4792]: I0318 17:24:00.956566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564244-cgrj4"] Mar 18 17:24:01 crc kubenswrapper[4792]: I0318 17:24:01.349860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" event={"ID":"30042883-da05-4930-88b1-f3ecd459b354","Type":"ContainerStarted","Data":"692e5b72a487556646033b82422c2f52b29aaf0643610d8b4033aaaaa58b9497"} Mar 18 17:24:02 crc kubenswrapper[4792]: I0318 17:24:02.365498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" event={"ID":"30042883-da05-4930-88b1-f3ecd459b354","Type":"ContainerStarted","Data":"85fbf61302386139dba697b8e14c8c77bbe3056deb0b3e5551dd404df7cfecd7"} Mar 18 17:24:02 crc kubenswrapper[4792]: I0318 17:24:02.391816 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" podStartSLOduration=1.346624326 podStartE2EDuration="2.391793905s" podCreationTimestamp="2026-03-18 17:24:00 +0000 UTC" firstStartedPulling="2026-03-18 17:24:00.957086506 +0000 UTC m=+6589.826415443" lastFinishedPulling="2026-03-18 17:24:02.002256075 +0000 UTC m=+6590.871585022" observedRunningTime="2026-03-18 17:24:02.382369068 +0000 UTC m=+6591.251698005" watchObservedRunningTime="2026-03-18 17:24:02.391793905 +0000 UTC m=+6591.261122842" Mar 18 17:24:02 crc kubenswrapper[4792]: I0318 17:24:02.854245 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:24:02 crc kubenswrapper[4792]: E0318 17:24:02.855770 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:24:03 crc kubenswrapper[4792]: I0318 17:24:03.390547 4792 generic.go:334] "Generic (PLEG): container finished" podID="30042883-da05-4930-88b1-f3ecd459b354" containerID="85fbf61302386139dba697b8e14c8c77bbe3056deb0b3e5551dd404df7cfecd7" exitCode=0 Mar 18 17:24:03 crc kubenswrapper[4792]: I0318 17:24:03.390779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" event={"ID":"30042883-da05-4930-88b1-f3ecd459b354","Type":"ContainerDied","Data":"85fbf61302386139dba697b8e14c8c77bbe3056deb0b3e5551dd404df7cfecd7"} Mar 18 17:24:04 crc kubenswrapper[4792]: I0318 17:24:04.933781 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.117607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wpd8\" (UniqueName: \"kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8\") pod \"30042883-da05-4930-88b1-f3ecd459b354\" (UID: \"30042883-da05-4930-88b1-f3ecd459b354\") " Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.124940 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8" (OuterVolumeSpecName: "kube-api-access-9wpd8") pod "30042883-da05-4930-88b1-f3ecd459b354" (UID: "30042883-da05-4930-88b1-f3ecd459b354"). InnerVolumeSpecName "kube-api-access-9wpd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.222097 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wpd8\" (UniqueName: \"kubernetes.io/projected/30042883-da05-4930-88b1-f3ecd459b354-kube-api-access-9wpd8\") on node \"crc\" DevicePath \"\"" Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.416454 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" event={"ID":"30042883-da05-4930-88b1-f3ecd459b354","Type":"ContainerDied","Data":"692e5b72a487556646033b82422c2f52b29aaf0643610d8b4033aaaaa58b9497"} Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.416501 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692e5b72a487556646033b82422c2f52b29aaf0643610d8b4033aaaaa58b9497" Mar 18 17:24:05 crc kubenswrapper[4792]: I0318 17:24:05.416516 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564244-cgrj4" Mar 18 17:24:06 crc kubenswrapper[4792]: I0318 17:24:06.025418 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-fvpxp"] Mar 18 17:24:06 crc kubenswrapper[4792]: I0318 17:24:06.045718 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564238-fvpxp"] Mar 18 17:24:07 crc kubenswrapper[4792]: I0318 17:24:07.870959 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbcbcda-17c8-448b-a8a0-272d2b92682e" path="/var/lib/kubelet/pods/acbcbcda-17c8-448b-a8a0-272d2b92682e/volumes" Mar 18 17:24:15 crc kubenswrapper[4792]: I0318 17:24:15.863362 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:24:15 crc kubenswrapper[4792]: E0318 17:24:15.864701 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:24:27 crc kubenswrapper[4792]: I0318 17:24:27.855563 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:24:27 crc kubenswrapper[4792]: E0318 17:24:27.856529 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" 
podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:24:39 crc kubenswrapper[4792]: I0318 17:24:39.859194 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:24:39 crc kubenswrapper[4792]: E0318 17:24:39.860138 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:24:53 crc kubenswrapper[4792]: I0318 17:24:53.854624 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:24:53 crc kubenswrapper[4792]: E0318 17:24:53.855568 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8" Mar 18 17:25:05 crc kubenswrapper[4792]: I0318 17:25:05.353477 4792 scope.go:117] "RemoveContainer" containerID="af7012d330dbd00f1b2b1e5892a6dce35ba81054d171d63379712bee687a9a9b" Mar 18 17:25:08 crc kubenswrapper[4792]: I0318 17:25:08.854901 4792 scope.go:117] "RemoveContainer" containerID="cd98ddf402454c1316346dd19d18bb77806f2a3a18048c08398f36ae68421585" Mar 18 17:25:08 crc kubenswrapper[4792]: E0318 17:25:08.855839 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2wtm6_openshift-machine-config-operator(e51cef14-7d91-4e08-8045-831f7a9a65f8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wtm6" podUID="e51cef14-7d91-4e08-8045-831f7a9a65f8"